Departmental Assessment Update - Engineering Report

Department: Civil and Environmental Engineering
Program: BSCE
Level: Undergraduate

1. List in detail your undergraduate Student Learning Outcomes (SLOs) for each degree/certificate offered.

The student learning outcomes describe what students are expected to know and be able to do at the time of graduation. They are:

  • a. an ability to apply knowledge of mathematics, science, and engineering
  • b. an ability to design and conduct experiments, as well as to analyze and interpret data
  • c. an ability to design a system, component, or process to meet desired needs
  • d. an ability to function on multi-disciplinary teams
  • e. an ability to identify, formulate, and solve engineering problems
  • f. an understanding of professional and ethical responsibility
  • g. an ability to communicate effectively
  • h. the broad education necessary to understand the impact of engineering solutions in a global, societal, and environmental context
  • i. a recognition of the need for, and an ability to engage in, life-long learning
  • j. a knowledge of contemporary issues
  • k. an ability to use the techniques, skills, and modern engineering tools necessary for engineering practice, particularly recognizing the integral role of computers in engineering and the rapid expansion of resources on the Internet.

2. Where are these SLOs published (e.g., department web page)?

The SLOs are published on the department's web page.

3. Explain how your SLOs map onto your curriculum, i.e., how does your curriculum produce the specific SLOs in your students?

Table 1 shows the linkages between the core curriculum and the program outcomes. Each course is linked to the outcomes it addresses using a three-point scale that reflects the degree of emphasis the course places on each outcome (see the note following Table 1); an illustrative sketch of this mapping appears after the table.

Table 1 Core curriculum linkages to program outcomes

Course: linked outcomes (level of emphasis in parentheses)

Freshman
  Eng 100: f(1), g(3), h(2), k(2)
  Math 241: a(3), f(1), k(2)
  Chem 161 & 161L: a(3), b(2), f(1), k(1)
  Sp 251: f(1), g(3), h(2)
  Math 242 & 242L: a(3), f(1), k(2)
  Phys 170 & 170L: a(3), b(3), f(1), k(2)
  Chem 162: a(3), f(1), k(1)
  EE 160/ICS 111: a(3), c(1), f(1), h(1), k(3)

Sophomore
  CEE 270: a(3), c(1), e(1), f(1), k(2)
  Math 243: a(3), f(1), k(2)
  Phys 272 & 272L: a(3), b(3), c(1), f(1), k(2)
  Hist 151: f(1), h(3)
  Humanities elective: f(1), h(3), j(3)
  CEE 271: a(3), c(1), e(3), f(1), g(1), i(1), k(1)
  Math 244: a(3), f(1), k(2)
  Hist 152: f(1), h(3)
  Economics elective: f(1), h(3), j(3)
  Social science elective: f(1), h(3), j(3)

Junior
  CEE 305: a(3), b(1), c(1), e(2), f(1), h(1), k(1)
  CEE 320: a(3), b(3), c(1), d(1), e(3), f(1), g(3), h(1), i(3), j(1), k(1)
  CEE 361: a(3), b(1), c(2), e(2), f(1), h(2), j(1), k(2)
  CEE 370 & 370L: a(2), b(3), c(1), d(1), e(2), f(1), j(1), k(2)
  Math 302/307; ME 403; GG 312: a(3), f(1), k(2)
  CEE 330: a(3), b(2), c(1), d(1), e(2), f(1), g(1), h(2), i(1), j(2), k(3)
  CEE 355: a(3), b(3), c(2), d(3), e(3), f(1), g(3), h(1), i(1), j(1), k(3)
  CEE 375: a(1), b(3), c(2), d(1), e(1), f(1), g(1), h(1), i(2), j(2), k(2)
  CEE 381: a(3), c(1), e(3), f(1), g(1), h(1), i(2), k(3)

Senior
  CEE 421: a(3), b(3), c(3), d(1), e(3), f(1), g(2), h(2), i(2), j(1), k(3)
  CEE 431: a(3), b(1), c(3), d(1), e(3), f(1), g(2), h(3), i(2), j(3), k(3)
  CEE 462: a(3), b(1), c(2), d(2), e(2), f(2), g(3), h(3), i(2), j(2), k(3)
    or CEE 464: a(3), b(1), d(2), e(2), f(1), g(3), h(3), i(1), j(1), k(2)
  CEE 472: a(1), c(1), d(1), e(1), f(2), g(2), h(3), i(3), j(2), k(2)
    or CEE 473: a(1), c(2), e(1), f(1), g(1), h(1), i(2), j(2), k(1)
    or CEE 474: a(1), b(1), c(2), e(2), f(2), g(2), h(2), i(2), j(2), k(2)
  CEE 485: a(3), b(2), c(3), d(1), e(3), f(1), g(1), i(1), k(1)
  CEE 455: a(3), b(1), c(2), d(1), e(2), f(2), j(1), k(1)
  CEE 490: a(3), b(2), c(3), d(3), e(3), f(3), g(3), h(2), i(2), j(2), k(3)
  Biological science elective: a(3), f(1), h(3)

Note: 1 = some emphasis; 2 = moderate emphasis; 3 = significant emphasis; missing outcome = no emphasis
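
For illustration only (this is not part of the Department's assessment process), the linkages in Table 1 can be viewed as a simple course-to-outcome data structure. The following Python sketch encodes a few representative rows of Table 1 and tallies how many of the encoded courses place significant emphasis (level 3) on each outcome.

    # Illustrative sketch only: a partial encoding of Table 1 (a few rows shown),
    # mapping each course to its outcome emphasis levels (1 = some, 2 = moderate,
    # 3 = significant; missing outcome = no emphasis).
    from collections import Counter

    curriculum = {
        "Eng 100":          {"f": 1, "g": 3, "h": 2, "k": 2},
        "Math 241":         {"a": 3, "f": 1, "k": 2},
        "Chem 161 & 161L":  {"a": 3, "b": 2, "f": 1, "k": 1},
        "CEE 270":          {"a": 3, "c": 1, "e": 1, "f": 1, "k": 2},
        # ... the remaining courses from Table 1 would be added the same way
    }

    # Count how many of the encoded courses give significant emphasis (level 3)
    # to each program outcome.
    significant = Counter(
        outcome
        for emphases in curriculum.values()
        for outcome, level in emphases.items()
        if level == 3
    )

    for outcome in "abcdefghijk":
        print(f"Outcome {outcome}: {significant.get(outcome, 0)} course(s) with significant emphasis")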

4. What specific methodologies were used to collect data? In developing your response, consider the following questions:

Each outcome is assessed using multiple (at least two) methods, as shown in Table 2 and illustrated in the sketch that follows it. The Department's four assessment tools, described in this section, are exit surveys (ES), exit interviews (EI), the Fundamentals of Engineering Exam (FE), and design portfolios (DP).

Table 2 Mapping of program educational outcomes and assessment methods

  • a. math, science & engineering: ES, FE, DP
  • b. design & conduct experiments: ES, EI
  • c. design system, component, process: ES, DP
  • d. function on multi-disciplinary teams: ES, DP
  • e. identify, formulate & solve eng. problems: ES, DP
  • f. professional & ethical responsibility: ES, EI, FE
  • g. communicate effectively: ES, EI, DP
  • h. understand impact in global & societal context: ES, DP
  • i. life-long learning: ES, EI
  • j. knowledge of contemporary issues: EI, DP
  • k. use techniques, skills and tools for eng. practice: ES, FE, DP
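
As a quick illustration of the coverage claim above (again, not one of the Department's assessment tools), the Table 2 mapping can be encoded directly; the Python sketch below checks that every outcome is assessed by at least two of the four methods.

    # Illustrative sketch only: the Table 2 mapping of program outcomes to the
    # four assessment tools (ES = exit survey, EI = exit interview,
    # FE = FE Exam, DP = design portfolio).
    assessment_methods = {
        "a": {"ES", "FE", "DP"},
        "b": {"ES", "EI"},
        "c": {"ES", "DP"},
        "d": {"ES", "DP"},
        "e": {"ES", "DP"},
        "f": {"ES", "EI", "FE"},
        "g": {"ES", "EI", "DP"},
        "h": {"ES", "DP"},
        "i": {"ES", "EI"},
        "j": {"EI", "DP"},
        "k": {"ES", "FE", "DP"},
    }

    # Confirm each outcome is covered by at least two methods, as stated above.
    for outcome, methods in sorted(assessment_methods.items()):
        assert len(methods) >= 2, f"Outcome {outcome} has fewer than two methods"
        print(f"Outcome {outcome}: {len(methods)} methods ({', '.join(sorted(methods))})")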

A. Exit Surveys

Exit surveys have been distributed to graduating students since Fall 1998. The initial survey was a general questionnaire on students' perceptions of their undergraduate program and their preparedness for engineering employment. The survey instrument was revised in 2000 and again in 2002 to provide more direct feedback on the program outcomes. The current exit survey is a one-page questionnaire that asks all graduating seniors to rate their perceived level of achievement of outcomes a through k after completing the program (the earlier survey instruments will be available for review at the site visit). Students are also asked to provide additional comments on the program. As part of our continuous quality improvement, this assessment tool is being modified.

B. Exit Interviews

Exit interviews are conducted as a follow-up to exit surveys. Each interview is a group, face-to-face dialog with all seniors a few days prior to graduation. The focus of the exit interview is to ask students to expand on outcomes b, f, g, i, and j. As part of our continuous quality improvement, this assessment tool is being modified.

Minutes of the exit interviews are recorded every spring and fall semester. Summer graduates participate in the spring interview. This assessment method was first performed during the Spring 2002 semester. Typically, about 90% of the graduating seniors attend the exit interview over lunch.

C. Commercial, Norm-referenced, Standardized Examinations

The Fundamentals of Engineering (FE) Exam is a nationally designed and administered examination that allows us to assess our students' capabilities, as they near graduation, relative to students across the U.S. From a civil engineering student's perspective, the FE Exam is important because it is the first step toward licensure as a professional engineer (P.E.). All students take a common general morning session; in the afternoon session, each student chooses either additional general questions or civil-engineering-specific questions.

The FE Exam is offered twice a year, in April and October. The exam is not required, but students are encouraged to take it. Evaluation of the scores provides direct feedback on the areas of study in which our graduates are competent, as well as the areas that should be improved. The FE Exam is used to assess outcomes a (math, science, and engineering), f (professional and ethical responsibility), and k (techniques, skills, and tools for engineering practice). The Department has discussed making the exam mandatory for graduation, although participation rates are already so high that a requirement would add little.

D. Design Portfolios

Design portfolios are collections of all work compiled during the capstone course (CEE 490), which is mandatory for all students in their senior year; the portfolios therefore assess outcomes close to graduation. Multiple outcomes can be assessed using design portfolios; we use them to assess outcomes a, c, d, e, g, h, j, and k. Assessment is performed by a panel of six professional practitioners, and the results of this rating are reported to the program assessment committee. One advantage of design portfolios is that they minimize test anxiety and other "one shot" measurement problems.

5. How were the assessment data/results used to inform decisions concerning the curriculum and administration of the program?

Evaluation of the program outcomes is facilitated using the four assessment methods. Steps in the evaluation process are as follows:

1. Set the performance criterion – The performance criterion for each outcome varies depending on the assessment method. A summary of the various performance criteria is given in Table 3 (an illustrative criterion check is sketched after this list).
2. Strategy for assessment – The strategy for assessment is through written exit surveys, in-person exit interviews, evaluation of FE Exam results, and evaluation of capstone design course portfolios.
3. Frequency of assessment – Assessment occurs every semester, with the exception of the design portfolio review, which is performed each spring semester.
4. Responsibility for assessment – Different faculty members are assigned as champions for each assessment method (see Table 3).
5. Reporting of assessment results – The assigned champions report the assessment results to the Department’s program assessment committee.
6. Recommendations and action – The program assessment committee reports and makes recommendations to the faculty for maintaining, modifying, or improving the curricula and the assessment process, including modifications to the program objectives and/or outcomes. The Department reviews the report and acts on the recommendations. This iterative process results in continuous quality improvement.
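
Purely as a hypothetical illustration of steps 1 and 6 (the actual performance criteria are summarized in Table 3 and are not reproduced here), a criterion check for a single assessment cycle might resemble the Python sketch below; every threshold and result value shown is a placeholder, not the Department's actual criteria or data.

    # Hypothetical sketch only: compare one semester's assessment results against
    # per-outcome performance criteria (all numbers below are placeholders, not
    # the Department's actual Table 3 criteria or measured results).
    criteria = {"a": 3.0, "f": 3.0, "k": 3.0}   # e.g., minimum average exit-survey rating
    results  = {"a": 3.4, "f": 2.8, "k": 3.1}   # e.g., measured average ratings

    # Flag outcomes that fall below their criterion so the program assessment
    # committee can recommend curricular or process changes (step 6).
    for outcome, criterion in criteria.items():
        measured = results[outcome]
        status = "met" if measured >= criterion else "NOT met; review recommended"
        print(f"Outcome {outcome}: measured {measured:.1f} vs. criterion {criterion:.1f} -> {status}")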


Changes that have resulted as a part of this process include:

  • CEE 490 was made mandatory (Fall 2002)
  • Structural and environmental tracks were added at the senior level
  • Lab courses are "permanently" writing intensive (WI), so students can fulfill their WI requirements by taking required courses
  • Degree requirements are continually evaluated and modified as necessary (ICS 111 and Math 307 were made acceptable alternates effective 8/2006)


6. Has the program developed learning outcomes? Please indicate yes or no.

Yes.

7. Has the program published learning outcomes? Please indicate yes or no.

Yes.

8. If so, please indicate how the program has published learning outcomes.

The SLOs are published on the department's web page.

9. What evidence is used to determine achievement of student learning outcomes?

See question 4.

10. Who interprets the evidence?

The faculty interpret most of the evidence. In addition, the senior design portfolios are reviewed by a panel of practicing engineers.

11. What is the process of interpreting the evidence?

We have a detailed set of performance criteria for each outcome (summarized in Table 3), and the outcomes are assessed against these criteria on a scheduled, ongoing basis.

12. Indicate the date of last program review.

The program was last reviewed in the fall of 2003 by ABET, the engineering accrediting organization.