Unit: Mechanical Engineering
Program: Mechanical Engineering (BS)
Degree: Bachelor's
Date: Tue Nov 10, 2009 - 9:08:27 am

1) List your program's student learning outcomes (SLOs).

a) An ability to apply knowledge of mathematics, science, and engineering

b) An ability to design and conduct experiments, analyze, and interpret data

c) An ability to design a system, component, or process to meet desired needs

d) An ability to function on multidisciplinary teams

e) An ability to identify, formulate, and solve engineering problems

f) An understanding of professional and ethical responsibility

g) An ability to communicate effectively

h) The broad education necessary to understand the impact of engineering solutions in a social context

i) A recognition of the need and an ability to engage in life-long learning

j) A knowledge of contemporary issues

k) An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.

2) Where are your program's SLOs published?

Department Website URL: www.me.hawaii.edu
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number: 225
Course Syllabi. URL, if available online: NA

3) Upload your program's current curriculum map(s) as a PDF.

Curriculum Map File(s) from 2009:

4) What percentage of courses have the course SLOs explicitly stated on the course syllabus, department website, or other publicly available document? (Check one)


5) State the SLO(s) that was Assessed, Targeted, or Studied

a through k (see response to Question 1).

6) State the Assessment Question(s) and/or Goal(s) of Assessment Activity

Surveys: On a scale of 0–1, to what degree were Outcomes a–k achieved?

Rubrics: Identify the percentage of students in a class who fall into each of the following categories for achieving Outcomes a–k: Poor, Fair, Good, Excellent.

7) State the Type(s) of Evidence Gathered

Data are collected continually at the end of each semester for every class; all students enrolled in a class are surveyed. In addition, exit interviews of graduating seniors are conducted before each graduation and cover approximately 50% of all graduating seniors.

The department's Industry Advisory Board (IAB) meets every 18 months and is given an overview of the department. The IAB also meets with and interviews the Student Advisory Board (SAB), a committee composed of student representatives from the department's professional society. Since many IAB members employ our graduates, they are able to provide a written assessment of both our program and the quality of our students. The SAB also gives the department feedback on the SLOs through written surveys. In addition, the ASME judges, who are local professional mechanical engineers, assess our SLOs through the final capstone senior design courses and provide a written assessment of both our program and the quality of our students. Finally, our program undergoes a rigorous national accreditation review by ABET (www.abet.org) at least once every six years. Our last accreditation visit was in 2003, and we received a full six-year accreditation, the highest possible. Our next visit will be in November 2009.

The results of the class surveys provide feedback to the instructors so they can institute changes as needed. Direct faculty assessments of students, through score cards, performance criteria, and rubrics for the SLOs associated with their required courses, also provide feedback that helps the faculty improve the implementation of the SLOs and ensure that students are adequately trained on the SLOs associated with their courses. The results of the exit interviews enable the chair to assign instructors to classes, modify the required classes, and institute changes in lab policies.

Each instructor, the chair, and the accreditation committee examine the class responses. The chair collects the data, and the assessment committee examines the data to provide feedback for implementing changes. Data are collected in class settings, except for exit interviews, which are one-on-one meetings between each graduating senior and the chair, conducted in the department office.

8) State How the Evidence was Interpreted, Evaluated, or Analyzed

The Mechanical Engineering ABET committee interprets the SLO evidence for each course through student surveys and the faculty score cards, performance criteria, and rubrics for outcomes a through k listed in Question 1. In addition, the SLOs are evaluated by Industry Advisory Board (IAB) members through their evaluation of student performance in Student Advisory Board (SAB) presentations, IAB/SAB meetings, faculty presentations, and laboratory tours. Further, the SLOs and a major design experience are assessed by the local senior section of the American Society of Mechanical Engineers (ASME). Also, employers of our graduates assess their performance in their organizations and provide performance feedback to the Department. Finally, the Chair of the Department conducts exit interviews covering the SLOs and anything else the students wish to discuss. The results obtained from the IAB, SAB, ASME, employers, alumni, exit interviews, students, and faculty are also interpreted by the ME ABET committee, and the ME faculty are provided feedback for curriculum improvement.

9) State How Many Pieces of Evidence Were Collected

I. Internal Assessments:

1. Student Teaching Evaluations (Indirect)

2. Student Assessments on Program Objectives and Outcomes (Indirect)

3. Student Exit Interviews (Direct)

4. Student Advisory Board, SAB, (Indirect)

5. Faculty Score Cards, Performance Criteria, Rubrics (Direct)

6. Course Portfolios (Direct)

II. External Assessments:

1. Industry Advisory Board, IAB, (Direct)

2. Employers of our graduates (Direct)

3. Alumni (Indirect)

4. Capstone Senior Design Evaluation by the ASME Senior Section (Direct)

5. Employers’ Comparative Assessments (Direct)

10) Summarize the Actual Results

•        PROBLEMS (identified through the Assessment tools shown above in Section 9):

•        1)  "l" & "m" of our "Outcomes" (which were originally "a thru m") are basically "Objectives" (see O3), not "Outcomes" (IAB, Employers, Exit Interview, DME)

•        2)  "d" of our "Outcomes" should be "multidisciplinary teams" rather than "An ability to solve multidisciplinary problems" (IAB, Employers, & DME)

•        3)  Our "Outcome i: A sound basis and motivation to engage in life-long learning" should be modified to "A recognition of the need and an ability to engage in life-long learning" (IAB, Employers, Alumni & DME)

•        4)  More "Direct Methods" of Assessment by the Faculty should be implemented (IAB, Employers & DME)

•        5)  Students' Class Surveys are long, and some questions are irrelevant (SAB, Students)

•        6)  A slight drop in the performance of our alumni compared with that of other schools (IAB, Employers)

•        7)  More emphasis is needed on teaching technical drawings and relevant software (IAB, Employers)

•        8)  Lab hours should be extended (IAB, Employers, SAB, Exit Interview)

•        9)  More Lab space and updated computers & software are needed for the Senior Design Courses (Exit, SAB, IAB, Employers)

•        10) A Faculty shortage in teaching hands-on courses, as well as in offering sufficient courses (including technical electives) for students to graduate on time. In addition, the freshman design course (ME 113), which is no longer being taught, should be reinstated (IAB, Employers, Alumni, SAB, Exit Interview)

•        SOLUTIONS (the numbers below correspond one-to-one to those in PROBLEMS) (Evaluation and Improvements of the Program):

•        1)  "l" & "m" are dropped from the "Outcomes" (see new Outcomes in Section 1)

•        2)  "d" of our "Outcomes" is changed to "An ability to function on multidisciplinary teams" (see new Outcomes in Section 1)

•        3)  Our "Outcome i" is changed to "A recognition of the need and an ability to engage in life-long learning" (see new Outcomes in Section 1)

•        4)  More "Direct Methods" of Assessment by the Faculty are implemented by setting up direct links between the Objectives, Outcomes, and Courses, as well as Faculty Score Cards and Rubrics for each individual Objective and its related Outcomes; the Faculty H-Form (the form that shows the amount of time spent on each Outcome, by hour) and the Course Syllabi (S-Forms) are modified accordingly.

•        5)  Students' Class Surveys are shortened to address only the Objectives and Outcomes relevant to each individual course, consistent with the links between the Objectives, Outcomes, and Courses (see Section 3, the Program Curriculum Map) as well as the Faculty Score Cards and Rubrics, H-Forms, and Course Syllabi (SAB, Students). Note: our surveys include questions on the Program Outcomes, as well as on the Objectives and the Major Design Experience.

•        6)  We have raised the passing standards in all our fundamental ME courses (Thermofluids, Mechanics, and Materials) by setting a "C-" prerequisite for these courses (IAB, Employers)

•        7)  We are now teaching technical drawings and SolidWorks (with COSMOS and COMSOL for FEA) in ME 481 & ME 482, but we need more faculty to re-establish ME 113 and teach these topics in ME 113 as well as ME 213.

•        8)  The Chair arranged to extend Lab hours.

•        9)  The Chair provided more Lab space and updated computers & software for the Senior Design Courses.

•        10) The Faculty shortage in teaching hands-on courses (including freshman design) and in offering sufficient courses for timely graduation remains unresolved (IAB, Exit Interview). However, the Dean's Office has allocated three additional positions for our department, contingent on the availability of funds.

11) Briefly Describe the Distribution and Discussion of Results

The Assessment results are shared with the Department ABET Committee, the Faculty, and the IAB. The Department ABET Committee also analyzes the results and presents them to the Faculty and the IAB. In joint ABET Committee–Faculty–IAB meetings, the results are further analyzed and evaluated, and solutions for program improvement are then proposed, evaluated, and finalized.

12) Describe Conclusions and Discoveries

The Assessment tools have been effective in identifying the existing problems in our department, so that solutions for program improvement can be found with the help of our constituents (Faculty, IAB, and SAB), employing the analysis and evaluation mechanism explained in Section 11.

13) Use of Results/Program Modifications: State How the Program Used the Results --or-- Explain Planned Use of Results

Our Department Program is accredited every six years. We are required to close the loop of Outcome Assessment, Analysis, Evaluation, and Improvement at least twice during each six-year period; therefore, we complete and close this loop every three years. We also use the Program Outcomes explained here, together with the Assessment tools and the Analysis, Evaluation, and Improvement mechanism described above, for our accreditation.

14) Reflect on the Assessment Process

We have recently (within the past few semesters) adopted Rubrics for Program Outcome Assessments by the Faculty. We are looking for more effective ways to implement our Rubrics and to use the results of our Rubrics Assessments.

15) Other Important Information

Our last Accreditation Visit was in 2003.  Our next Accreditation Visit is in late November 2009.  We have closed the loop twice (First Loop:  2003-2006, and the Second Loop: 2006-2009).

16) FOR DISTANCE PROGRAMS ONLY: Explain how your program/department has adapted its assessment of student learning in the on-campus program to assess student learning in the distance education program.


17) FOR DISTANCE PROGRAMS ONLY: Summarize the actual student learning assessment results that compare the achievement of students in the on-campus program to students in the distance education program.