Program: Electrical Engineering (BS)
Degree: Bachelor's
Date: Tue Oct 12, 2010 - 10:17:59 am
1) Below are the program student learning outcomes submitted last year. Please add/delete/modify as needed.
All graduates of the Electrical Engineering Program are expected to have:
- Knowledge of probability and statistics, including examples relevant to Electrical Engineering (program criteria).
- Knowledge of mathematics through differential and integral calculus, basic sciences, and engineering sciences necessary to analyze and design complex devices and systems containing hardware and software.
- Knowledge of advanced mathematics, including differential equations (program criteria).
- Demonstrated an ability to design and conduct experiments, as well as to interpret data.
- Demonstrated an ability to design a system or component that meets desired needs within realistic constraints such as economic, environmental, social, political, ethical, health and safety, manufacturability, and sustainability.
- Demonstrated an ability to function in a multi-disciplinary team.
- Demonstrated an ability to identify, formulate and solve electrical engineering problems.
- Understanding of professional and ethical responsibility.
- Demonstrated an ability to communicate effectively (written and oral).
- Demonstrated an understanding of the impact of engineering solutions in a global, economic, environmental, and societal context.
- Recognition of the need for life-long learning.
- Demonstrated a knowledge of contemporary issues.
- Demonstrated an ability to use the techniques, skills, and modern tools necessary for engineering practice.
2) As of last year, your program's SLOs were published as follows. Please update as needed.







3) Below is the link to your program's curriculum map (if submitted in 2009). If it has changed or if we do not have your program's curriculum map, please upload it as a PDF.
- File (03/16/2020)
4) The percentage of courses in 2009 that had course SLOs explicitly stated on the syllabus, a website, or other publicly available document is indicated below. Please update as needed.





5) State the assessment question(s) and/or goals of the assessment activity. Include the SLOs that were targeted, if applicable.
All our SLOs are assessed every semester.
The following are the different methods we use to evaluate SLOs:
- Industrial Advisory Board (IAB) Reports
- Course Surveys
- EE 496 Senior Capstone Design Reviews
- Measuring Outcomes Achievement in Courses
- EBI Survey
6) State the type(s) of evidence gathered.
In the last year, the main assessment conducted was through course surveys.
Course Surveys: At the end of every course, students are surveyed about how much the course helps them achieve each of the Outcomes, with ratings of 0 (not at all), 1 (insignificant), 2 (moderate), and 3 (significant). Every year, the data is used to estimate, for each Outcome, the number of Contributing Courses from the Student Experience. The faculty makes a similar estimate from the instructor’s perspective, referred to as Contributing Courses from the Faculty Expectations. The two are compared to determine the level of student achievement.
7) Who interpreted or analyzed the evidence that was collected?










8) How did they evaluate, analyze, or interpret the evidence?







9) State how many persons submitted evidence that was evaluated.
If applicable, please include the sampling technique used.
Most of the students in EE courses filled out the surveys; for most courses, faculty also submitted surveys. The assessment committee reviewed and evaluated the data. Some of the results are presented here.
Our department surveys the students of all undergraduate courses every semester. The survey scores are aggregated every year to provide a macroscopic view of how our curriculum is fulfilling each Outcome. In particular, the scores are used to estimate how many courses significantly help students to achieve each Outcome. We refer to this estimate as the contributing courses, which is computed as follows.
The survey asks each student to rate how much the course contributes to each of the eleven Outcomes. The ratings are scores where 3 = “significant”, 2 = “moderate”, 1 = “insignificant”, and 0 = “not at all”. This rating is averaged over all students per Outcome and course for the year. We next translate this average rating into a fraction F:
F = max{average rating - 1, 0}/2,
which is an estimate of the fraction of the course that contributes significantly to the Outcome. Note that if the average rating = 3 (“significant”) then F = 1, which implies that the entire course contributes significantly to the Outcome. If the average rating = 2 (“moderate”) then F = 0.5, which implies that half the course contributes significantly to the Outcome. If the average rating = 1 or 0 then F = 0, which means the course does not contribute significantly to the Outcome.
Then an estimate of the number of credits of the course that contributes significantly to the Outcome is
Credit Hours * F.
We refer to this product as the contributing credits for the Outcome. After computing the contributing credits for all courses in an academic year, we sum these credits per Outcome as follows. The curriculum is divided into sets of courses as described in Section VI--Criterion 5:
· Engineering Required, which is required of all EE students. However, EE 296, 396, and 496 are not counted here. EE 296 and 396 are excluded because the experiences of students in them can differ considerably; to be conservative, we do not add them to the contributing courses. EE 496 projects are excluded because they are measured directly by faculty through reviews of project reports.
· Track Group I, which is required of all Track students
Thus, the sum of contributing credits may be different for each Track.
To make better sense of these contributing credits, we scale them to another metric, which we refer to as contributing courses as experienced by our students. Since a regular course is 3 credit hours, we define
contributing courses = contributing credits/3.
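As a sketch, the per-Outcome computation described above can be expressed in a few lines of code; the course credit hours and average ratings below are hypothetical examples, not actual survey data:

```python
def contribution_fraction(average_rating):
    """F = max{average rating - 1, 0}/2: estimated fraction of the course
    that contributes significantly to the Outcome."""
    return max(average_rating - 1, 0) / 2

def contributing_courses(courses):
    """courses: list of (credit_hours, average_rating) pairs for one Outcome.
    Sums the contributing credits and scales by 3 credit hours per course."""
    contributing_credits = sum(ch * contribution_fraction(r) for ch, r in courses)
    return contributing_credits / 3

# Hypothetical example: three courses rated for a single Outcome.
# Ratings 3.0, 2.0, and 1.5 give fractions 1.0, 0.5, and 0.25.
example = [(3, 3.0), (3, 2.0), (4, 1.5)]
print(contributing_courses(example))
```

Note that the max{., 0} clamp means any course with an average rating of 1 (“insignificant”) or below contributes nothing, matching the interpretation given above.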
Each of the three tracks (Systems, Computer Engineering, and Electrophysics) has about the same performance. Occasional data collection problems resulted in some missing data; since missing data does not add to the contributing courses, our estimates are conservative.
10) Summarize the actual results.
We note that the technical Outcomes, in particular Outcomes 1 (math and basic science), 2 (experimental), 3 (design), 5 (EE problems), and 11 (design tools), have a high number of contributing courses, between 6 and 9 each. The other Outcomes have about 3 contributing courses each. Thus, students seem to have enough experience to achieve the Outcomes. Perhaps more material should be presented so that the soft Outcomes can be improved.
11) How did your program use the results? --or-- Explain planned use of results.
Please be specific.
Once we receive the data, the Chair of the department and the assessment committee first look at ways to interpret the data and then, if we see weaknesses, at ways to improve the program. We have not acted specifically on the course surveys, but if we see the same results in some of the direct assessments, we will consider modifying the curriculum and changing specific courses so that improvements can be made in Outcomes that are not strong.
12) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, program aspects and so on.
13) Other important information:
The EE department was accredited by ABET for six years after the ABET evaluators made an on-site visit to evaluate our program in November 2009.