Program: Mechanical Engineering (BS)
Degree: Bachelor's
Date: Mon Nov 15, 2010 - 8:55:21 am
1) Below are the program student learning outcomes submitted last year. Please add/delete/modify as needed.
a) An ability to apply knowledge of mathematics, science, and engineering
b) An ability to design and conduct experiments, analyze, and interpret data
c) An ability to design a system, component, or process to meet desired needs
d) An ability to function on multidisciplinary teams
e) An ability to identify, formulate, and solve engineering problems
f) An understanding of professional and ethical responsibility
g) An ability to communicate effectively
h) The broad education necessary to understand the impact of engineering solutions in a social context
i) A recognition of the need for, and an ability to engage in, life-long learning
j) A knowledge of contemporary issues
k) An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice.
2) As of last year, your program's SLOs were published as follows. Please update as needed.
3) Below is the link to your program's curriculum map (if submitted in 2009). If it has changed or if we do not have your program's curriculum map, please upload it as a PDF.
- File (03/16/2020)
4) The percentage of courses in 2009 that had course SLOs explicitly stated on the syllabus, a website, or other publicly available document is indicated below. Please update as needed.
5) State the assessment question(s) and/or goals of the assessment activity. Include the SLOs that were targeted, if applicable.
All of the SLOs, a through k (see the response to Question 1), are assessed. The assessment question is: to what extent (on a scale of 0 to 1) is each SLO achieved, based on the students' learning experiences and their performance either while at school (i.e., in their course work) or on the job?
What did the program want to find out?
Surveys: On a scale of 0 to 1, to what degree were Outcomes a-k achieved?
Rubrics: For a given class, how do students perform in achieving Outcomes a-k, categorized as Poor, Fair, Good, or Excellent (on a scale of 1 to 4)? A minimal sketch of how such rubric scores might be tabulated follows.
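Purely as an illustration (this is not part of the program's documented procedure), a tabulation of this kind might look like the sketch below. The mapping of the 1-4 rubric scale onto the 0-1 achievement scale, the sample scores, and all names are assumptions made for the example.

```python
# Illustrative sketch only: tabulating rubric scores (1 = Poor, 2 = Fair,
# 3 = Good, 4 = Excellent) into a 0-1 achievement figure per SLO.
# The (score - 1) / 3 normalization is an assumption for this sketch; the
# report does not state how the 1-4 scale maps onto the 0-1 scale.

from collections import defaultdict
from statistics import mean

# Hypothetical input: (SLO letter, rubric score) pairs recorded by faculty.
rubric_scores = [
    ("a", 4), ("a", 3), ("a", 2),
    ("b", 3), ("b", 4),
    ("g", 1), ("g", 3),
]

def achievement_by_slo(scores):
    """Average the rubric scores per SLO and rescale 1-4 onto 0-1."""
    by_slo = defaultdict(list)
    for slo, score in scores:
        by_slo[slo].append(score)
    return {slo: (mean(vals) - 1) / 3 for slo, vals in sorted(by_slo.items())}

for slo, level in achievement_by_slo(rubric_scores).items():
    print(f"SLO {slo}: achievement {level:.2f}")
```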
6) State the type(s) of evidence gathered.
Data are collected continually at the end of each semester for every class, and exit interviews of graduating seniors are conducted before each graduation. The department's Industry Advisory Board (IAB) meets every 18 months and is given an overview of the department. The IAB also meets with and interviews the Student Advisory Board (SAB), a committee composed of student representatives from the department's professional society. Since many IAB members employ our graduates, they are able to provide a written assessment of both our program and the quality of our students. The SAB also gives the department feedback on the SLOs through written surveys. In addition, the ASME judges, who are local professional mechanical engineers, assess our SLOs through the final capstone senior design courses and provide a written assessment of both our program and the quality of our students. Finally, our program undergoes a rigorous national accreditation review by ABET (www.abet.org) at least once every six years. Our last accreditation visit was in November 2009, and we received a full six-year accreditation, the highest possible; our next visit will be in November 2015.

All students enrolled in a class are surveyed. The results of the class surveys provide feedback to the instructors so that they can institute changes as needed. Faculty direct assessments of the students, through score cards, performance criteria, and rubrics tied to the SLOs of their required courses, also provide feedback that helps the faculty improve the implementation of the SLOs and ensure that students are adequately instilled with, and properly trained on, the SLOs associated with their courses. The results of the exit interviews, which cover approximately 50% of all graduating seniors, enable the chair to assign instructors to classes, modify the required classes, institute changes in lab policies, and so on. Each instructor, the chair, and the accreditation committee examine the class responses; the chair collects the data, and the assessment committee examines them to provide feedback for implementing changes. Data are collected in class settings, except for the exit interviews, which are conducted as one-on-one meetings between the graduating seniors and the chair in the department office.
7) Who interpreted or analyzed the evidence that was collected?
8) How did they evaluate, analyze, or interpret the evidence?
9) State how many persons submitted evidence that was evaluated.
If applicable, please include the sampling technique used.
About 290 students; over 10 different companies that employ our graduates (including Northrop Grumman, Boeing, HECO, Pearl Harbor, etc.); 20 alumni; 6 ASME judges; 10 IAB members; 10 SAB members; 40 senior students; 13 faculty; and 2 ABET evaluators. The following lists the sampling techniques used for our assessments:
I. Internal Assessments:
1. Student Teaching Evaluations (Indirect)
2. Student Assessments on Program Objectives and Outcomes (Indirect)
3. Student Exit Interviews (Direct)
4. Student Advisory Board (SAB) (Indirect)
5. Faculty Score Cards, Performance Criteria, Rubrics (Direct)
6. Course Portfolios (Direct)
II. External Assessments:
1. Industry Advisory Board (IAB) (Direct)
2. Employers of our graduates (Direct)
3. Alumni (Indirect)
4. Capstone Senior Design Evaluation by the ASME Senior Section (Direct)
5. Employers’ Comparative Assessments (Direct)
6. ABET visit (every two to six years, depending on the accreditation granted; we received a six-year accreditation in 2009, valid until 2015)
10) Summarize the actual results.
Through the evaluation, analysis, and interpretation of the results obtained from the assessment tools described in Question 9, the following problems were identified:

PROBLEMS (identified through the assessment tools listed above in Question 9):
1) More direct methods of assessment by the faculty should be implemented (ABET 2009 visit, IAB, Employers, and DME).
2) The rubrics should be assessed, tabulated, and evaluated, and the resulting improvements implemented more directly and objectively, based on the SLOs embedded in the faculty's direct assessments of students (such as the use of rubrics as a grading tool for homework, exams, projects, etc.) (ABET 2009 visit).
3) More up-to-date computers and software are needed for the DME design courses (Exit Interviews, SAB, IAB, Employers).
4) More lab space is needed for the DME design projects (Exit Interviews, SAB, IAB, Employers).
5) The machinery and tools in the DME machine shop need to be updated for the DME students' course projects.
6) There is a faculty shortage, both for teaching hands-on courses and for offering sufficient courses (including technical electives) for students to graduate on time. In addition, the freshman design course, ME 113, which is no longer being taught, should be reinstated (IAB, Employers, Alumni, SAB, Exit Interviews).
11) How did your program use the results? --or-- Explain planned use of results.
Please be specific.
The following explains the corrective actions that the DME took to resolve the problems described in Question 10, presented below as solutions. Each solution number in this section (Question 11) corresponds to the problem with the same number in the previous section (Question 10).

SOLUTIONS (evaluation and improvements of the program; the numbers correspond one-to-one to those under PROBLEMS):
1) More direct methods of assessment by the ME faculty are being implemented by setting up direct links between objectives, outcomes, and courses, as well as faculty score cards and rubrics for each individual objective and its related outcomes; the faculty H-Forms (the forms that show the amount of time spent on each outcome, by hour) and the course syllabi (S-Forms) are being modified accordingly (a minimal sketch of this kind of course-to-outcome bookkeeping follows this list).
2) Direct and objective implementation of the rubrics, based on the SLOs embedded in the faculty's direct assessments of students (such as the use of rubrics as a grading tool for homework, exams, projects, etc.), is being designed for the assessments, tabulations, and evaluations, and will be implemented so that the results can be used for continuous program improvement.
3) We have adopted two new software packages: first, SolidWorks (with COSMOS and COMSOL for FEA analysis) in ME 213 (Sophomore Design) and ME 481 and ME 482 (Senior Design); second, ANSYS FLUENT (with computational fluid dynamics capabilities) in ME 481 and ME 482, in addition to the existing FEA software for solid modeling and structural analysis (ANSYS). The computers in the DME labs have been updated.
4) The chair is in the process of providing more lab space for students' projects.
5) The chair is in the process of updating the machinery and tools in the DME machine shop for the students' course projects.
6) The faculty shortage in teaching hands-on courses (including the freshman design course) and in offering sufficient courses for students' timely graduation remains (IAB, Exit Interviews, ABET 2009 visit). In addition, one of our faculty resigned and another retired; we have received these two positions back because of the severity of the faculty shortage in the DME. To address some of the comments and concerns from the ABET 2009 visit, we also received a third position to replace a faculty member who retired over a year ago. The Dean's office has committed to giving the DME an additional two positions; however, due to the economic downturn, these positions are contingent on the availability of funds. The DME needs more faculty to re-establish ME 113 and to teach the computer-aided design software in both ME 113 and ME 213.
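For illustration only, the course-to-outcome bookkeeping that the H-Forms describe might be tabulated along the lines of the sketch below. The course numbers appear in this report, but the hour figures, data layout, and function names are hypothetical assumptions for the example.

```python
# Illustrative sketch only: an H-Form, as described above, records the hours
# a course spends on each outcome (a-k). The hour figures below are invented
# for the example; only the course numbers come from this report.

from collections import Counter

# Hypothetical H-Form data: course -> {outcome letter: contact hours}.
h_forms = {
    "ME 213": {"a": 10, "c": 20, "e": 8, "k": 7},
    "ME 481": {"c": 15, "d": 12, "g": 10, "k": 8},
    "ME 482": {"c": 15, "d": 12, "g": 10, "h": 5},
}

def outcome_coverage(forms):
    """Total hours spent on each outcome across all courses."""
    totals = Counter()
    for hours_by_outcome in forms.values():
        totals.update(hours_by_outcome)
    return totals

# Flag outcomes a-k that no listed course currently covers.
covered = outcome_coverage(h_forms)
for outcome in "abcdefghijk":
    hours = covered.get(outcome, 0)
    flag = "" if hours else "  <- not covered"
    print(f"Outcome {outcome}: {hours} h{flag}")
```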
12) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, program aspects and so on.
Overall, the assessment tools have been effective in identifying the existing problems in our department, so that solutions for program improvement can be found with the help of our constituents (i.e., the faculty, IAB, and SAB) using the analysis and evaluation mechanisms explained in the previous sections. However, one specific discovery was made, primarily by our accreditation team (the ABET 2009 visit team members and evaluator), in November 2009: they identified that the rubrics used by our faculty to assess students on the SLOs were not objective and direct, and they required a more objective and direct use of rubrics (such as the use of rubrics as a grading tool for homework, exams, projects, etc.) to remedy this issue. We are in the process of modifying and implementing such techniques.
13) Other important information:
Our last accreditation visit was in November 2009. The comments were to use more direct and objective ways of implementing the rubrics, to ensure that the department budget is sufficient to operate the department effectively, and to ensure that there are no disparities or inequities in salaries among the College of Engineering faculty.