Unit: Political Science
Program: Political Science (BA)
Degree: Bachelor's
Date: Mon Oct 21, 2013 - 6:56:11 am

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

1.     Students will be able to think critically and historically about power and the political. Students identify and analyze power dynamics in a range of social contexts and processes, including but not limited to language, government, images of the future and civil society institutions. Students will be able to pose and explore relevant, open-ended questions about authority and legitimacy.

2.     Students will be able to craft and defend evidence-based arguments. This argumentative capacity is built upon their ability to rigorously and respectfully weigh competing views, synthesize multiple sources, and critically reflect on their own and others' assumptions. Students should be able to make arguments in both written and oral forms of communication.

3.     Students will be able to communicate effectively in public settings, with attention to and appreciation of diverse cultural contexts. Students are equipped for productive, civic participation in their communities, able to synthesize critical thinking, empathic, collaborative and argumentative capacities, and futures thinking with an audience in mind.

4.     Students will be able to cogently explain the interconnectedness of local and global dynamics of power within the context of the political and cultural specificities of Hawaiʻi nei.

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.politicalscience.hawaii.edu/undergraduate-program.html
Student Handbook. URL, if available online: NA
Information Sheet, Flyer, or Brochure URL, if available online: NA
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:

3) Select one option:

Curriculum Map File(s) from 2013:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.


5) Did your program engage in any program assessment activities between June 1, 2012 and September 30, 2013? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)

Yes

6) For the period June 1, 2012 to September 30, 2013: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.

Last year, 2012-2013, we piloted an undergraduate survey on SurveyShare, received feedback on it from the Manoa Assessment Office, and then revised it. This fall we also moved the survey over to Qualtrics. Currently we are in the process of drafting items for the survey that would provide direct evidence, rather than just indirect evidence, of student achievement of our new departmental SLOs.

This summer, the undergraduate adviser attended the Manoa Assessment Office's workshop. At that time, the department's SLOs were revisited and suggested revisions were drafted. At the start of the fall semester, the UG curriculum committee for our department reviewed them and suggested further revisions. At the end of September, our full department reviewed the new SLOs and, after a week of discussion and further revision, approved the SLOs included in this report.

We are now in the process of devising further assessment activities aligned with the new SLOs. As mentioned above, our first step is to draft questions for each SLO that can be added to our UG entrance and exit surveys. These questions would ask students to examine various political texts (cartoons, videos, etc.) and write a short response to a prompt. These data can then be used as snapshots, taken at the beginning and at the end of a student's journey through the major, to gauge progression on the SLOs.

Our next goal for the fall and spring is to build a more cohesive assessment system for our capstones. Several of the faculty who advise the capstones are already doing their own assessments, but we need to coordinate and gather that information at the program level. If the Assessment Office has any samples that we could build on for internship and teaching assistantship capstones, that would be very helpful.

We have also proposed a new structure for our 110 and 305/315 core courses, which the department is currently considering. If approved, we believe this will make it easier to coordinate assessment at the early levels of our core curriculum, because we will have at least one large course (100 students) with centralized supervision of the TAs who run the discussion sections.

7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.

We are still in the design phase.

8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

One step we have taken is to put out a call to all faculty and graduate students who teach courses that are part of the core for our major. We have asked them to submit papers or other student work that exemplify excellent, just-passing, and unacceptable work for the class. We created a Dropbox folder to archive these exemplars and have had about a 40% response rate from instructors. As we continue to build this archive, the papers can be used to create rubrics or other program-level assessment tools.

9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)

10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)

11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.

We have not reached this stage yet.

12) State how the program used the results or plans to use the results. Please be specific.

13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.

14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.

We are making gains, but the process can be frustrating for the assessment coordinator at times. Under the current structure, the undergraduate advisor (who advises our 250 UG majors) also chairs the UG curriculum and assessment committee. This is simply too much to handle with no course release or overload pay. Ideally, advising should be split from assessment coordination so that they are two separate positions. We also need someone hired during the summer to help analyze the data that we gather throughout the year.