Unit: Information & Computer Science
Program: Computer Science (BS)
Degree: Bachelor's
Date: Mon Nov 05, 2018 - 11:34:29 am

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Students can apply knowledge of computing and mathematics appropriate to the discipline

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report)

2. Students can analyze a problem, and identify and define the computing requirements appropriate to its solution

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report)

3. Students can design, implement, and evaluate a computer-based system, process, component, or program to meet desired needs

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report)

4. Students can function effectively on teams to accomplish a common goal

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report, 3a. Continuous learning and personal growth, 3d. Civic participation)

5. Students have an understanding of professional, ethical, legal, security and social issues and responsibilities

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report, 3b. Respect for people and cultures, in particular Hawaiian culture)

6. Students can communicate effectively with a range of audiences

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report)

7. Students can analyze the local and global impact of computing on individuals, organizations, and society

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report, 3b. Respect for people and cultures, in particular Hawaiian culture)

8. Students can recognize the need for and an ability to engage in continuing professional development

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 3a. Continuous learning and personal growth)

9. Students can use current techniques, skills, and tools necessary for computing practice.

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report)

10. An ability to use and apply current technical concepts and practices in the core information technologies. [BA IT only]

11. An ability to identify and analyze user needs and take them into account in the selection, creation, evaluation and administration of computer-based systems. [BA IT only]

12. An ability to effectively integrate IT-based solutions into the user environment. [BA IT only]

13. An understanding of best practices and standards and their application. [BA IT only]

14. An ability to assist in the creation of an effective project plan. [BA IT only]

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.ics.hawaii.edu/academics/undergraduate-degree-programs/bs-ics/
Student Handbook. URL, if available online: http://www.ics.hawaii.edu/wp-content/uploads/2015/08/ics-academic-plan-2015.pdf
Information Sheet, Flyer, or Brochure URL, if available online: http://www.ics.hawaii.edu/wp-content/uploads/2015/08/ics-academic-plan-2015.pdf
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: http://courses.ics.hawaii.edu/syllabuses/
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2018:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

No
Yes, on some(1-50%) of the program SLOs
Yes, on most(51-99%) of the program SLOs
Yes, on all(100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between June 1, 2015 and October 31, 2018?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period June 1, 2015 to October 31, 2018? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
No (skip to question 17)
Investigate other pressing issue related to student learning achievement for the program (explain in question 7)
Other:

8) Briefly explain the assessment activities that took place.

The ICS Department recently began preparing for ABET accreditation. Professor Casanova is currently leading assessment efforts by developing a Web application, DataBET, to collect accreditation data organized around the student objectives. Faculty teaching courses in the ABET accreditation line submit data that includes performance on each objective, sample student work illustrating poor, mid-range, and good examples, and a section for additional comments.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

11 faculty submitted data to DataBET

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results of the assessment activities checked in question 7. For example, report the percentage of students who achieved each SLO.

The DataBET Web app collects data for each of the ABET accreditation student objectives. It lets faculty upload their student achievement scores and sample work directly to the system at the end of the term. Because a wide range of data was collected, a summary of the data gathered during the assessment timeframe for our introductory programming course (ICS 111) is included below. The collected data and faculty feedback were used to inform course sequencing. For example, Calculus II was made a prerequisite to Algorithms so that students develop mathematical maturity earlier in their coursework.

ICS 111 Sample Data

Spring 2016
SO1: Good: 23, Average: 66, Approaching: 26 [average score 66.8%]
SO2: Good: 32, Average: 42, Approaching: 33 [average score 64.8%]

Fall 2016
SO1: Good: 74, Average: 30, Approaching: 53 [average score 70.8%]
SO2: Good: 107, Average: 11, Approaching: 39 [average score 77.8%]

Spring 2017
SO1: Good: 51, Average: 36, Approaching: 40 [average score 70.6%]
SO2: Good: 54, Average: 12, Approaching: 61 [average score 61.4%]
SO3: Good: 74, Average: 18, Approaching: 35 [average score 71.3%]
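As a minimal sketch of how per-outcome counts like those above can be rolled up, the snippet below converts category counts into percentage shares per student objective. The function name and category labels are illustrative assumptions; the actual DataBET schema and scoring method are not described in this report.

```python
# Hypothetical sketch: summarizing DataBET-style achievement counts.
# Category labels ("good", "average", "approaching") mirror the sample
# data above; the real DataBET data model is an assumption here.

def summarize(counts):
    """Return the percentage of students in each achievement category,
    rounded to one decimal place."""
    total = sum(counts.values())
    return {cat: round(100 * n / total, 1) for cat, n in counts.items()}

# Spring 2017, SO1 counts from the sample data above
so1 = {"good": 51, "average": 36, "approaching": 40}
print(summarize(so1))  # share of students per category
```

A summary like this makes term-to-term comparison easier than raw counts, since section enrollments vary between semesters.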

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other: Modification of assessment structure to meet ABET requirements.

15) Please briefly describe how the program used the results.

The department used the results to re-align course content with ABET accreditation standards. Faculty also identified areas for student development and added peer mentoring in appropriate lower-level courses. The program additionally modified the course sequencing to improve student success in subsequent courses; however, we have not yet had the opportunity to assess this change.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

The program learned a great deal about the ABET accreditation process. This has been difficult because the requirements are a moving target, and acquiring the appropriate data for a moving target is a consistent challenge. Recent changes include revised student objective requirements, additional projects, and increased math credit hours.

17) If the program did not engage in assessment activities, please justify.

N/A