Program: Civil Engineering (BS)
Date: Wed Oct 10, 2012 - 9:39:20 am
1) Below are your program's student learning outcomes (SLOs). Please update as needed.
The student learning outcomes (SLOs), also known as program outcomes, describe a skill set that students are expected to have at the time of graduation. The SLOs are:
a. an ability to apply knowledge of mathematics, science, and engineering
b. an ability to design and conduct experiments, as well as to analyze
and interpret data
c. an ability to design a system, component, or process to meet
desired needs within realistic constraints such as economic,
environmental, social, political, ethical, health and safety,
manufacturability and sustainability
d. an ability to function on multi-disciplinary teams
e. an ability to identify, formulate, and solve engineering problems
f. an understanding of professional and ethical responsibility
g. an ability to communicate effectively
h. the broad education necessary to understand the impact of engineering
solutions in a global, economic, societal, and environmental context
i. a recognition of the need for, and an ability to engage in, life-long learning
j. a knowledge of contemporary issues
k. an ability to use the techniques, skills, and modern engineering
tools necessary for engineering practice, particularly recognizing the
integral role of computers in engineering and the rapid expansion of
resources on the internet.
2) Your program's SLOs are published as follows. Please update as needed.
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number: NA
Course Syllabi. URL, if available online:
3) Select one option:
- File (03/16/2020)
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.
5) Did your program engage in any program assessment activities between June 1, 2011 and September 30, 2012? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)
No (skip to question 14)
6) For the period June 1, 2011 to September 30, 2012: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.
Please refer to the tables in Question 3, which contain our schedule of assessment. Some assessments are behind schedule, but we will eventually catch up.
7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.
See the table in Question 3. In general, we use performance appraisals, design portfolios, and the F.E. exam.
8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.
Again, please refer to the tables in Question 3. We try to promote participation from every faculty member; whether or not we are successful is another issue.
9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)
Ad hoc faculty group
Persons or organization outside the university
Advisors (in student support services)
Students (graduate or undergraduate)
10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.
Last year was a slow year for assessment. Data were collected for many outcomes, but the results were analyzed and documented for only two. We will step up our efforts in the coming years.
12) State how the program used the results or plans to use the results. Please be specific.
The results will be used for program improvement. If we identify changes that would improve the assessment results, we will make them.
13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.
As indicated previously, last year was a lull in our assessment effort. Little was learned, since only 2 of the 11 outcomes were assessed. The results for these two outcomes indicated that student performance was satisfactory.
14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.
We have had two changes of leadership in our assessment committee in the last three years. Many of the committee members have been working on assessment for the last two ABET cycles, and there is likely some degree of burnout, compounded by heavy research and teaching responsibilities. We need ideas on how to pass the torch to other faculty, who may or may not be interested in assessment. If Monica and/or Marlene are able to help us motivate other faculty, that would be great.