Unit: Mechanical Engineering
Program: Mechanical Engineering (BS)
Degree: Bachelor's
Date: Fri Oct 21, 2011 - 4:48:38 am

1) Below are your program student learning outcomes (SLOs). Please update as needed.

 Student Learning Outcomes (SLOs)

a) An ability to apply knowledge of mathematics, science, and engineering

b) An ability to design and conduct experiments, as well as to analyze and interpret data

c) An ability to design a system, component, or process to meet desired needs

d) An ability to function on multi-disciplinary teams

e) An ability to identify, formulate, and solve engineering problems

f) An understanding of professional and ethical responsibility

g) An ability to communicate effectively

h) The broad education necessary to understand the impact of engineering solutions in a societal context

i) A recognition of the need for, and an ability to engage in, life-long learning

j) A knowledge of contemporary issues

k) An ability to use the techniques, skills, and modern engineering tools necessary for engineering practice

ME Department Professional Components/Major Design Experience, PC1 and PC2 (note: the PCs apply to the senior design courses ME 480, 481, and 482):

PC1: A culminating design experience that integrates knowledge and skills acquired throughout the curriculum

PC2: The application of engineering standards and realistic constraints, including consideration of economics, environmental sustainability, manufacturability, ethics, health, safety, society, and politics

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: www.me.hawaii.edu
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number: 227
Course Syllabi. URL, if available online: NA
Other:
Other:

3) Below is the link(s) to your program's curriculum map(s). If we do not have your curriculum map, please upload it as a PDF.

Curriculum Map File(s) from 2011:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) For the period June 1, 2010 to September 30, 2011: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.

SLOs a through k (see the response to Question 1) are assessed. The assessment question is: to what extent (on a scale of 0 to 1) are the SLOs achieved, based on students' learning experiences and their performance either in school (i.e., in their coursework) or on the job?

 What did the program want to find out?

Surveys: On a scale of 0 to 1, to what degree were Outcomes a-k achieved?

Rubrics: Identify the performance of students in a class in achieving Outcomes a-k, using the categories Poor, Fair, Good, and Excellent (on a scale of 1 to 4).
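For illustration only, here is a minimal sketch (in Python) of how the 1-4 rubric scores might be tabulated per outcome and mapped onto the 0-1 achievement scale used in the surveys. The scores, names, and linear mapping below are hypothetical assumptions for illustration, not the department's actual tool or data.

    # Hypothetical sketch: tabulate 1-4 rubric scores per SLO and map the
    # class mean onto the 0-1 achievement scale (1 -> 0.0, 4 -> 1.0).
    from statistics import mean

    CATEGORIES = {1: "Poor", 2: "Fair", 3: "Good", 4: "Excellent"}

    def achievement(scores):
        """Map the mean of 1-4 rubric scores onto the 0-1 scale."""
        return (mean(scores) - 1) / 3

    # Invented example data: rubric scores for two outcomes in one class.
    rubric_scores = {
        "a": [3, 4, 2, 3, 4],  # Outcome (a): apply math, science, engineering
        "g": [2, 3, 3, 4, 3],  # Outcome (g): communicate effectively
    }

    for slo, scores in rubric_scores.items():
        m = mean(scores)
        print(f"SLO ({slo}): mean {m:.2f} ({CATEGORIES[round(m)]}), "
              f"achievement {achievement(scores):.2f}")

A linear mapping is assumed here only to connect the two scales mentioned above; the department may use a different convention.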

6) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #5.

To assess the outcome or answer the assessment question, what evidence was gathered?

Data are collected continually at the end of each semester for every class. In addition, exit interviews of graduating seniors are conducted before each graduation. The department's Industry Advisory Board (IAB) meets every 18 months and is given an overview of the department. The IAB also meets with and interviews the Student Advisory Board (SAB), a committee composed of student representatives from the department's professional society. Since many IAB members employ our graduates, they are able to provide a written assessment of both our program and the quality of our students. The SAB also gives the department feedback on the SLOs through written surveys. In addition, the ASME judges, who are local professional mechanical engineers, assess our SLOs through the final capstone senior design courses and provide a written assessment of both our program and the quality of our students. Finally, our program undergoes a rigorous national accreditation review by ABET (www.abet.org) at least once every six years. Our last accreditation visit was in November 2009, and we received a full six-year accreditation, the highest possible. Our next visit will be in November 2015.

All students enrolled in a class are surveyed. The results of the class surveys provide feedback to the instructors, enabling them to institute changes as needed. Faculty direct assessments of the students, through score cards, performance criteria, and rubrics for the SLOs associated with their required courses, also serve as feedback for the faculty to improve the implementation of the SLOs and to ensure that students are adequately trained on the SLOs associated with their courses. The results of the exit interviews enable the chair to assign instructors to classes, modify the required classes, institute changes in lab policies, etc. Exit interviews cover approximately 50% of all graduating seniors. Each instructor, the chair, and the accreditation committee examine the results of the class responses. The chair collects the data, and the assessment committee examines the data to provide feedback for implementing changes. Data are collected in class settings, except for exit interviews, which are conducted as one-on-one meetings between graduating seniors and the chair in the department office.

7) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

About 240 students; over 10 companies that employ our graduates (including Northrop Grumman, Boeing, HECO, Pearl Harbor, etc.); 20 alumni; 3 ASME judges; 10 IAB members; 10 SAB members; 40 senior students; 11 faculty; and the ABET evaluators. The following explains the sampling techniques used for our assessments:

 I. Internal Assessments:

 1. Student Teaching Evaluations (Indirect)

2. Student Assessments on Program Objectives and Outcomes (Indirect)

3. Student Exit Interviews (Direct)

4. Student Advisory Board (SAB) (Indirect)

5. Faculty Score Cards, Performance Criteria, Rubrics (Direct)

6. Course Portfolios (Direct)

II. External Assessments:

1. Industry Advisory Board (IAB) (Direct)

2. Employers of our graduates (Direct)

3. Alumni (Indirect)

4. Capstone Senior Design Evaluation by the ASME Senior Section (Direct)

5. Employers’ Comparative Assessments (Direct)

6. ABET visit (every 3-6 years, depending on the accreditation granted). We received a six-year accreditation in 2009, through 2015.

8) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: Department ABET (Accreditation) Committee and its chair

9) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

10) For the assessment question(s) and/or assessment goal(s) stated in Question #5:
Summarize the actual results.

Through the evaluation, analysis, and interpretation of the results obtained using the assessment tools described in Questions 5 through 9, the following problems were identified:

  • PROBLEMS:
  • 1) More “direct method” assessment by the faculty should be implemented (ABET 2009 visit, IAB, employers, and DME).
  • 2) The rubrics should be assessed, tabulated, and evaluated, and the resulting improvements implemented more directly and objectively, based on the SLOs embedded in the faculty's direct assessments of students (e.g., using rubrics as a grading tool for homework, exams, projects, etc.) (ABET 2009 visit).
  • 3) More up-to-date computers and software are needed for the DME design courses (exit interviews, SAB, IAB, employers).
  • 4) More lab space is needed for the DME design projects (exit interviews, SAB, IAB, employers).
  • 5) The machinery and tools in the DME machine shop need to be updated for DME students' course projects.
  • 6) There is a faculty shortage for teaching hands-on courses and for offering sufficient courses (including technical electives) for students to graduate on time.
  • 7) Due to the increase in the number of students, more sections of the same courses need to be offered for students to graduate on time.

11) State how the program used the results or plans to use the results. Please be specific.

The following explains the corrective actions that the DME has taken to resolve the problems listed under Question 10. Each solution number below corresponds to the problem with the same number.

  • SOLUTIONS (evaluation and improvement of the program; the numbers correspond one-to-one to those under PROBLEMS):
  • 1) More “direct method” assessment by the ME faculty is being implemented by setting up direct links between objectives, outcomes, and courses, as well as faculty score cards and rubrics for each individual objective and its related outcomes; the faculty H-Form (the form that shows the number of hours spent on each outcome) and the course syllabi (S-Forms) are being modified accordingly.
  • 2) A direct and objective implementation of the rubrics, based on the SLOs embedded in the faculty's direct assessments of students (e.g., using rubrics as a grading tool for homework, exams, projects, etc.), is being designed for the assessments, tabulations, and evaluations, and will be implemented so that the results can be used for continuous program improvement.
  • 3) We have adopted two new software packages: first, SolidWorks (with COSMOS and COMSOL for FEA analysis) in ME 213 (Sophomore Design) and in ME 481 and ME 482 (Senior Design); second, ANSYS FLUENT (with computational fluid dynamics capabilities) in ME 481 and ME 482, in addition to the existing FEA software for solid modeling and structural analysis (ANSYS). The computers in Holmes Hall 308 (for ME 342, ME 402, and ME 480) have been updated, and there is a plan to update the computers in Holmes Hall 309 (for ME 213, ME 481, and ME 482).
  • 4) The chair has provided more lab space for students' projects in Holmes Hall 140.
  • 5) The department has purchased new machinery and tools for the DME machine shop for student course projects; they will arrive during the Fall 2011 semester and will be installed before the end of that semester.
  • 6) The faculty shortage for teaching hands-on courses and for offering sufficient courses for students' timely graduation (IAB, exit interviews, ABET 2009 visit) is being resolved by hiring two new assistant professors in Fall 2011 and a third by Spring 2012. The department had 13 faculty at the time of the ABET visit in November 2009, and we pointed out to ABET that the department should receive 3 more faculty to bring the total to 16. The number of faculty was reduced to 11 in Fall 2010 due to one resignation and one retirement; the three new assistant professors raise the number from 11 in Fall 2010 to 14 by Spring 2012, so per our November 2009 ABET visit we need two more faculty positions. The Dean's office is committed to giving the DME two additional positions; however, due to the economic downturn, these positions are contingent on the availability of funds. In addition, the Dean's office enabled the departments in the college to hire lecturers and graders to offer more courses and more sections of existing courses.
  • 7) Due to the increase in the number of students, more sections of the same courses are needed for students to graduate on time. By hiring the new assistant professors, as well as lecturers and graders, we have been able to remedy some of the problems associated with insufficient course and section offerings.

12) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.

Overall, the assessment tools have been effective in identifying the existing problems in our department, so that solutions for program improvement can be found with the help of our constituents (i.e., faculty, IAB, SAB, and students) using the assessment, analysis, and evaluation mechanisms explained in the previous sections. In addition, the ABET 2009 visit team members and evaluator found that the rubrics our faculty used to assess students on the SLOs were not direct and objective, and they recommended a more objective and direct use of rubrics (e.g., as a grading tool for homework, exams, projects, etc.); we are in the process of modifying our procedures and implementing such techniques to remedy this issue.

13) Other important information.
Please note: If the program did not engage in assessment, please explain. If the program created an assessment plan for next year, please give an overview.

Our last accreditation visit was in November 2009. The comments were to use more direct and objective ways to implement the rubrics, to ensure that the department budget is sufficient to operate the department effectively and to hire needed new faculty, and to ensure that there are no disparities or inequities in salaries among College of Engineering faculty.

We have assessment tools in place (as explained in this report) and, in accordance with ABET's requirement for continuous program assessment, analysis, evaluation, and improvement, this is an ongoing and recurring process.