Program: Electrical Engineering (BS)
Degree: Bachelor's
Date: Mon Oct 19, 2009 - 12:47:16 pm
1) List your program's student learning outcomes (SLOs).
All graduates of the Electrical Engineering Program are expected to have:
- Knowledge of probability and statistics, including examples relevant to Electrical Engineering (program criteria). Knowledge of mathematics through differential and integral calculus, basic sciences, and engineering sciences necessary to analyze and design complex devices and systems containing hardware and software. Knowledge of advanced mathematics, including differential equations (program criteria).
- Demonstrated an ability to design and conduct experiments, as well as to interpret data.
- Demonstrated an ability to design a system or component that meets a specified need.
- Demonstrated an ability to function in a multi-disciplinary team.
- Demonstrated an ability to identify, formulate and solve electrical engineering problems.
- Understanding of professional and ethical responsibility.
- Demonstrated an ability to communicate effectively (written and oral).
- Demonstrated an understanding of the impact of engineering solutions in a global and societal context.
- Recognition of the need for life-long learning.
- Demonstrated a knowledge of contemporary issues.
- Demonstrated an ability to use the techniques, skills, and modern tools necessary for engineering practice.
2) Where are your program's SLOs published?
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number: http://www.catalog.hawaii.edu/schoolscolleges/engineer/ee.htm
Course Syllabi. URL, if available online:
3) Upload your program's current curriculum map(s) as a PDF.
- File (03/16/2020)
4) What percentage of courses have the course SLOs explicitly stated on the course syllabus, department website, or other publicly available document? (Check one)
1-50%
51-80%
81-99%
100%
5) State the SLO(s) that was Assessed, Targeted, or Studied
All our SLOs are assessed every semester.
The following are the methods we use to evaluate SLOs:
- Industrial Advisory Board (IAB) Reports
- Course Surveys
- EE 496 Senior Capstone Design Reviews
- Measuring Outcomes Achievement in Courses
- EBI Survey
6) State the Assessment Question(s) and/or Goal(s) of Assessment Activity
Industrial Advisory Board (IAB) Reports: During the IAB meetings, the board members complete a survey rating how well our students achieve each of our Outcomes, with ratings 1 (poor), 2 (marginal), 3 (satisfactory), and 4 (exemplary).
Course Surveys: At the end of every course, the students are surveyed about how much the course helps achieve each of the Outcomes, with ratings 0 (not at all), 1 (insignificant), 2 (moderate), and 3 (significant). Every year, the data are used to estimate, for each Outcome, the number of Contributing Courses from the Student Experience. A similar estimate is made by the faculty from the instructor’s perspective, referred to as Contributing Courses from the Faculty Expectations. The two are compared to determine the level of student achievement.
EE 496 Senior Capstone Design Reviews: The faculty annually reviews a random sample of EE 496 Senior Capstone Design projects through surveys. For each project, there are two faculty reviewers: the faculty project advisor and an “outside reviewer”.
Measuring Outcomes Achievement in Courses: Performance criteria and scoring rubrics are developed for each SLO by expanding the SLO into measurable criteria. These criteria are then measured in different EE undergraduate courses. Instructors make direct assessments of how well students achieve the criteria by using the scoring rubrics.
EBI Survey: EBI conducted an exit survey of our recent graduates in Fall 2008 where 19 students participated.
7) State the Type(s) of Evidence Gathered
The assessment methods were described in the answer to question 6. The evidence gathered was data rating the achievement of each outcome.
8) State How the Evidence was Interpreted, Evaluated, or Analyzed
We have an assessment committee that examines the data and interprets the results. Data were gathered and analyzed statistically. To assess the accuracy of the results, we compare how each outcome is rated by the different assessment methods. The committee responsible is our ABET (engineering accreditation) committee. The faculty members involved are Profs. Gurdal Arslan, Luca Macchiarulo, Galen Sasaki, Anders Host Madsen, David Garmire, Yingbin Liang, and Anthony Kuh (chair of the EE department).
9) State How Many Pieces of Evidence Were Collected
We used five different methods to assess the SLOs.
Industrial Advisory Board (IAB) Reports:
The IAB meets every one to two years. Five to ten IAB members participate in the meetings and are surveyed on the level of SLO achievement.
Course Surveys:
All undergraduate classes are assessed every semester to evaluate the level of SLO achievement. Each student in every class fills out a survey.
EE 496 Senior Capstone Design Reviews:
Project reports are sampled every year. Six reports are chosen, and each is evaluated by the project advisor and one other faculty member on the level of achievement of the SLOs.
Measuring Outcomes Achievement in Courses:
On average, each undergraduate course is sampled once every three years to measure student achievement of specific SLOs. We started this assessment in 2008.
EBI Survey:
These surveys are sent to alumni and are done once every six years.
10) Summarize the Actual Results
Table 1 summarizes the student level of achievement of Outcomes using the different assessment methods, where “3” = exemplary, “2” = satisfactory, “1” = marginal, and “0” = unsatisfactory. The overall level of student achievement is shown in the bottom row of the table. The two assessment methods that are direct assessments are marked “[Direct Assessment]” in the table.
Table 1. Level of student achievement of Outcomes.

| Assessment Method \ Outcome | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 |
|---|---|---|---|---|---|---|---|---|---|---|---|
| IAB Reports | 3 | 2 | 2 | 1 | 2 | 1 | 1 | 0 | 1 | 1 | 2 |
| Course Surveys | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 2 | 2 |
| EE 496 Senior Capstone Design Reviews [Direct Assessment] | 1 | 2 | 2 | 1 | 2 | 1 | 2 | 1 | 2 | 1 | 2 |
| Measuring Outcome Achievement in EE Courses (rubrics) [Direct Assessment] | 2 | 2 | 2 | 3 | 3 | – | – | – | – | – | – |
| EBI Exit Survey | 3 | 3 | 3 | 2 | 3 | 0 | 2 | 3 | 2 | 1 | 2 |
| Overall Level of Achievement | 2 | 2 | 2 | 2 | 2 | 1 | 2 | 1 | 2 | 1 | 2 |

(“–” = not yet measured by this method.)
The table shows that Outcome 8 has an overall level of achievement of 1 (“marginal”). Though there is some emphasis in the curriculum, as shown in Figure 3, there is no strong emphasis in any EE course. Note that this Outcome receives some emphasis in some non-EE courses, which is not considered in the evaluation in Table 1. Note also that we do not yet measure achievement of this Outcome in an EE course. However, there is a proposal to do so through a research and writing assignment; the proposal is described in the section “Proposal to Measure Outcome 8” of this folder.
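This report does not state the rule used to combine the per-method ratings into the overall row. One rule consistent with every value in the bottom row of Table 1 is the floor of the median of the available method levels; the sketch below is illustrative only, and the helper name is hypothetical rather than a documented committee procedure.

```python
# Hypothetical reconstruction: the floor of the median of the available
# per-method levels (0-3) reproduces the "Overall Level of Achievement"
# row of Table 1. Methods with no rating are simply omitted.
import math
import statistics

def overall_level(method_levels: list[int]) -> int:
    """Combine per-method achievement levels into an overall level."""
    return math.floor(statistics.median(method_levels))

# Outcome 8: IAB 0, Course Surveys 2, EE 496 1, EBI 3 -> median 1.5 -> level 1.
print(overall_level([0, 2, 1, 3]))  # 1
```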
11) Briefly Describe the Distribution and Discussion of Results
See item 12. The ABET committee and the EE faculty received the results. Discussion took place in faculty meetings and in weekly ABET meetings.
12) Describe Conclusions and Discoveries
Appendix A. IAB Reports
During Industrial Advisory Board (IAB) meetings, the board members complete a survey of how our students achieve our Outcomes giving ratings from 1 (“not at all”) to 4 (“a great deal”). Table A.1 has the average ratings from the 2005 and 2007 IABs as well as the overall average of the ratings. This overall average is translated into a level of achievement as follows:
| Avg. score from Table A.1 | Level of student achievement |
|---|---|
| 3.5 and above | 3 (“Exemplary”) |
| 3.0 to below 3.5 | 2 (“Satisfactory”) |
| 2.5 to below 3.0 | 1 (“Marginal”) |
| 2.0 to below 2.5 | 0 (“Poor”) |
Table A.1. 2005 and 2007 IAB average ratings of level of achievement of Outcomes.

| Outcome | Avg 2005 | Avg 2007 | Overall Avg | Level of Achievement |
|---|---|---|---|---|
| 1. Knowledge of probability and statistics, including examples relevant to Electrical Engineering (program criteria). Knowledge of mathematics through differential and integral calculus, basic sciences, and engineering sciences necessary to analyze and design complex devices and systems containing hardware and software. Knowledge of advanced mathematics, including differential equations (program criteria). | 3.70 | 3.56 | 3.63 | 3 |
| 2. Demonstrated an ability to design and conduct experiments, as well as to interpret data. | 3.35 | 3.33 | 3.34 | 2 |
| 3. Demonstrated an ability to design a system or component that meets a specified need. | 3.52 | 3.33 | 3.43 | 2 |
| 4. Demonstrated an ability to function in a multi-disciplinary team. | 2.87 | 2.94 | 2.91 | 1 |
| 5. Demonstrated an ability to identify, formulate and solve electrical engineering problems. | 3.52 | 3.22 | 3.37 | 2 |
| 6. Understanding of professional and ethical responsibility. | 2.45 | 2.78 | 2.62 | 1 |
| 7. Demonstrated an ability to communicate effectively (written and oral). | 2.78 | 2.56 | 2.67 | 1 |
| 8. Demonstrated an understanding of the impact of engineering solutions in a global and societal context. | 2.25 | 2.61 | 2.43 | 0 |
| 9. Recognition of the need for life-long learning. | 3.00 | 2.89 | 2.95 | 1 |
| 10. Demonstrated a knowledge of contemporary issues. | 2.70 | 3.11 | 2.91 | 1 |
| 11. Demonstrated an ability to use the techniques, skills, and modern tools necessary for engineering practice. | 3.52 | 3.22 | 3.37 | 2 |
Appendix B. Course Surveys
Our department surveys the students of all undergraduate courses every semester. At the end of every course, students rate how the course contributes to each outcome: 3 = “significant”, 2 = “moderate”, 1 = “insignificant”, and 0 = “not at all”. For each course, the average scores are translated to a fraction
F = max{average rating - 1, 0}/2,
which corresponds to
| Average Rating | F |
|---|---|
| 3 (“significant”) | 1.0 |
| 2 (“moderate”) | 0.5 |
| 1 (“insignificant”) | 0.0 |
| 0 (“not at all”) | 0.0 |
F is interpreted as the fraction of the course that contributes significantly to the Outcome. Thus, F = 1 implies that the entire course contributes, while F = 0.5 implies that half of the course contributes.
Given these estimates of F, we can estimate how many of the required 3-credit EE courses in Figures 1 and 2 contribute significantly to an Outcome. For example, if 3 courses have F = 1, 2 courses have F = 0.5, and the rest have F = 0, then the number of contributing courses is 3 × 1 + 2 × 0.5 = 4. We refer to this estimate as the number of contributing courses as experienced by students. Note that we have estimates per Track, covering both the courses required of all EE students (Figure 1) and the courses required per Track (Figure 2). For example, a Computer Track student must take all courses in Figure 1 and all Computer Track Group I courses in Figure 2.
Note that 1-credit courses count as 1/3 of a 3-credit course, and 4-credit courses count as 4/3 of a 3-credit course. Also note that the student project courses, EE 296, 396, and 496, were not counted, since the student experience can vary quite a bit depending on the project. The calculation is sketched below.
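As an illustration, the estimate can be computed directly from the survey averages and credit hours. The following sketch is hypothetical: the function names and the example course data are made up, not actual department data.

```python
# Minimal sketch of the contributing-courses estimate described above.
def contribution_fraction(avg_rating: float) -> float:
    """F = max(average rating - 1, 0) / 2, so 3 -> 1.0, 2 -> 0.5, <=1 -> 0."""
    return max(avg_rating - 1.0, 0.0) / 2.0

def contributing_courses(courses: list[tuple[float, int]]) -> float:
    """Sum of F over courses, weighted by credits relative to a 3-credit course.

    Each course is (average student rating for the Outcome, credit hours):
    a 1-credit course counts as 1/3 of a course, a 4-credit course as 4/3.
    """
    return sum(contribution_fraction(r) * credits / 3.0 for r, credits in courses)

# Example from the text: three 3-credit courses rated "significant" (F = 1)
# and two rated "moderate" (F = 0.5) give 3*1 + 2*0.5 = 4 contributing courses.
example = [(3.0, 3), (3.0, 3), (3.0, 3), (2.0, 3), (2.0, 3)]
print(contributing_courses(example))  # 4.0
```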
Figure B.1 has the number of contributing courses as experienced by students in academic year 2005-06. We should note that we did have some occasional missing data. Since this missing data does not add to the contributing courses, our estimates are conservative.
We also did a similar estimate of contributing courses but from the perspective of faculty instructors. Faculty members, and in particular course coordinators, were given surveys to rate how each course contributes to Outcomes. The data was used to estimate the number of contributing courses from faculty expectations which is shown in Figure B.2.
We compared the two estimates of contributing courses by subtracting faculty expectations from student experiences. Figures B.3, B.4, and B.5 show these differences for 2005-06, 2006-07, and 2007-08. Student experiences generally meet or exceed faculty expectations for all Outcomes. (The only cases where student experience was below faculty expectations were for Outcome 3; but as Figure B.3 shows, faculty expectations are quite high for that Outcome and difficult to achieve.) Thus, we translate the results of the figures to a student level of achievement of 2 (“satisfactory”) for all Outcomes.
Figure B.1. Contributing courses (Track required) for the 2005-06 academic year.
Appendix C. EE 496 Senior Capstone Design Reviews
Table C.1 shows the results over two years. In the first year (2005-06), only three project reports were reviewed; in the second year, six project reports were reviewed. Per year, scores were averaged over the number of respondents. Note that some faculty did not provide a rating for every Outcome, e.g., if the faculty member did not feel qualified to provide a rating.
There is a column with an overall average score over the two years. The overall scores were then translated into student level of achievement as follows:
| Avg. score from Table C.1 | Level of student achievement |
|---|---|
| 3.5 and above | 3 (“Exemplary”) |
| 3.0 to below 3.5 | 2 (“Satisfactory”) |
| 2.5 to below 3.0 | 1 (“Marginal”) |
| 2.0 to below 2.5 | 0 (“Poor”) |
Note that the average scores can range from a minimum of 1 to a maximum of 4 because the individual scores are 1 (“None”), 2 (“Marginal”), 3 (“Good”), and 4 (“Very Good”). Thus, this is a conservative translation from average scores to level of student achievement.
Table C.1 is not shown here because it is too large, but all SLOs have ratings of satisfactory or marginal.
Appendix D. EBI Surveys
EBI conducted an exit survey of our recent graduates in Fall 2008 where 19 students participated. The survey had questions relating to the eleven Outcomes, where some Outcomes had multiple related questions. For each question, numerical scores given by our graduates were averaged and compared to 68 other institutions. The comparison led to a ranking from 1 (best) to 68 (worst). We translated these rankings into a level of achievement where
| Avg. ranking | Upper percentage | Level of student achievement |
|---|---|---|
| 1 – 17 | Top 25% | 3 (“Exemplary”) |
| 18 – 34 | Second 25% | 2 (“Satisfactory”) |
| 35 – 51 | Third 25% | 1 (“Marginal”) |
| 52 – 68 | Bottom 25% | 0 (“Poor”) |
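The translation amounts to a quartile lookup over the 68-institution pool. The sketch below is hypothetical: the helper name is made up, and borderline entries in Table D.1 (such as Outcome 11 at ranking 17.0 mapping to level 2) suggest the exact cutoffs may sit slightly differently.

```python
# Sketch of the ranking-to-level translation (hypothetical helper).
# Assumes quartiles of the 68-institution pool, so the bottom band
# runs from ranking 52 to 68.
def level_from_ranking(avg_ranking: float) -> int:
    """Translate an average EBI ranking (1 = best) into a level (0-3)."""
    if avg_ranking <= 17:
        return 3  # Top 25% ("Exemplary")
    if avg_ranking <= 34:
        return 2  # Second 25% ("Satisfactory")
    if avg_ranking <= 51:
        return 1  # Third 25% ("Marginal")
    return 0      # Bottom 25% ("Poor")

print(level_from_ranking(56.0))  # 0, matching Outcome 6 in Table D.1
print(level_from_ranking(28.0))  # 2, matching Outcome 4
```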
Table D.1 has the resulting student level of achievement per Outcome.
Table D.1. Level of achievement from EBI exit survey.

| Outcome | Average Ranking | Upper Percentage | Level of Achievement |
|---|---|---|---|
| 1 | 8.0 | 12% | 3 |
| 2 | 8.7 | 13% | 3 |
| 3 | 12.0 | 18% | 3 |
| 4 | 28.0 | 41% | 2 |
| 5 | 16.7 | 25% | 3 |
| 6 | 56.0 | 82% | 0 |
| 7 | 29.5 | 43% | 2 |
| 8 | 14.0 | 21% | 3 |
| 9 | 34.0 | 50% | 2 |
| 10 | 40.0 | 59% | 1 |
| 11 | 17.0 | 25% | 2 |
13) Use of Results/Program Modifications: State How the Program Used the Results --or-- Explain Planned Use of Results
This refers to our ABET committee, which assesses and evaluates the SLOs (designated as Outcomes here).
After evaluating the data about the achievement of Outcomes, the ABET Committee determines and delegates any action items in response to the evaluation.
- If the action item concerns curriculum or coursework, it is delegated to the Undergraduate Curriculum Committee (UCC). For example, in Fall 2006 the ABET Committee decided that the curriculum needed additional emphasis on ethics; the UCC proposed a new 1-credit-hour ethics course, realized the following academic year as EE 495 Ethics in Electrical Engineering.
- If the action item concerns faculty, space, or resources, it is delegated to the Department Chair.
- The ABET Committee is responsible for action items that cannot be delegated to the UCC or the Department Chair.
The ABET Committee also revises the Outcomes when it finds it necessary, and designs and improves performance measures and evaluation methods.
14) Reflect on the Assessment Process
The assessment process is evolving as we adopt new tools for assessing SLOs. We are in the process of measuring SLOs through outcome achievement in undergraduate courses, a three-year process that started in 2008 and will finish in 2011.