Unit: Pacific Islands Studies
Program: Pacific Islands Studies (BA)
Degree: Bachelor's
Date: Fri Oct 11, 2013 - 3:15:49 pm

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

BA in Pacific Islands Studies

1.1 Students can describe the diversity of Oceania.

1.2 Students can identify major events in the history of the region.

1.3 Students can explain indigenous issues and concerns.

1.4 Students can analyze processes of change in island societies.

1.5 Students can interact with and advocate for Pacific Island communities at home or abroad.

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL:
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online: NA
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: NA

3) Select one option:

Curriculum Map File(s) from 2013:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.


5) Did your program engage in any program assessment activities between June 1, 2012 and September 30, 2013? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)

Yes

6) For the period June 1, 2012 to September 30, 2013: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.

Based on last year’s activity and in direct response to our concerns about samples for rubrics, we focused on identifying the best examples of ‘accomplished’, ‘competent’, ‘developing’, and ‘beginning’ work at each level (100-300) for SLO3 and SLO5.

We also aimed to develop a timeline for assessing each SLO, one per year, for the BA (and MA) program(s), and to identify in advance which specific embedded assignments (e.g., essay questions, quizzes, etc.) will be collected and scanned for assessment. This has been particularly challenging with the multiple sections and instructors of PACS 108.

7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.

Over the 2012-2013 academic year, faculty were asked to collect data through embedded course assignments. These included student term papers and short essays in one 200-level and two 300-level courses, and student short answers on final exams for three sections of the 100-level course.

Each faculty member presented syllabi and course SLOs relevant to our BA curriculum map. Each member reconsidered how program outcomes were introduced, reinforced, mastered, or assessed in their specific courses, and what assignments or evidence were used to determine student progress toward both course and program outcomes. Many recognized a gap between SLOs and course assessment, and we collectively brainstormed methods and offered suggestions for determining student success on course SLOs. We also helped each other identify assignments and, in some cases, suggested new assignments that would demonstrate student progress toward program SLOs.

8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

There were approximately 5-6 essays/writing samples evaluated per course (average enrollment is 20 students, 30-35 in 100-level courses). For the past two years, most exams and student papers have been scanned and archived by our faculty. Exam questions and written assignments that target SLO3 were identified, and every third paper in each batch scan of the assignment was selected for evaluation (5 from 201, 5 from 301, 5 from 302, 15 from 108). Approximately 30 individual students’ work was evaluated for SLO3.

Due to the coverage of SLO5 described in our curriculum map, 10 individual students’ essays (in two courses, one 200-level and one 300-level) were reviewed for SLO5.

9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)

10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)

11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.

We were able to rate and rank samples of student writing relevant to SLO3. At the 100 level, 25% of students were competent, 15% were developing, and 60% were beginning. We selected the best and worst samples and discussed how these would guide our evaluations in the future for consistency. At the 200 level, we found that 80% of students were competent and 20% were developing in their ability to explain indigenous issues and concerns. At the 300 level, we were surprised to see that 10% of students were accomplished, 40% were competent, and 50% were either beginning or developing skills in explaining these issues. We wondered whether we graded the upper-division students more harshly or whether the data itself was not as clearly linked to the SLOs as the other writing samples. We selected the best and worst samples for archiving and further discussion.

In discussing SLO5, we reviewed 10 writing assignments regarding service learning activities. We determined that 25% of students were competent, 25% were developing, and 50% were beginning to interact appropriately with and advocate for Pacific communities. Our discussions centered on the preparation for service learning and the high percentage of non-majors enrolled in PACS courses. We have not yet analyzed the two Senior Capstones completed to date.

In reviewing the syllabi, many faculty acknowledged a new awareness of the need to connect course assessment more closely with course assignments. A focus on student progress toward course and program SLOs is desired, and support for doing so would be helpful.

12) State how the program used the results or plans to use the results. Please be specific.

Approaches to SLO3 at the 100 and 200 levels are successful in introducing these skills, but more could be done to target them. Drawing attention to the SLO more explicitly at the 100 level would raise awareness as students move through the program. Many non-PACS majors enroll in our courses; we believe PACS majors are more successful at explaining indigenous issues and concerns, but our results reflect the general population of our classes. At the 300 level, more planning and design of the assignments collected would make comparison and evaluation easier. We had a hard time evaluating the diverse assignments, and they were not as clearly linked to SLO3 as other assignments were.

In discussing SLO5, we realized that the best way to evaluate this program SLO is by closely evaluating the Senior Capstone. To date, we have had two graduates and have offered the Capstone as a “reading” course to individual students, while incorporating each capstone student into a face-to-face course that enables them to discuss issues and ideas with peers. We need to design a rubric for evaluating all program SLOs through the Senior Capstone.

Due to the large number of non-PACS majors served by our courses, and the small size of our courses (often due to enrollment limits in Focus-designated courses), it is challenging to assess program SLOs adequately for PACS majors.

13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.

14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.