Unit: Political Science
Program: Political Science (PhD)
Degree: Doctorate
Date: Mon Nov 16, 2015 - 4:01:51 pm

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

We assume students who enter graduate level study have been given appropriate training in the fundamentals of the discipline and possess the qualities necessary to produce graduate-level work. From the admissions process on, students are assessed upon several important outcomes.

1. The ability to produce quality scholarship.

At the graduate level, we anticipate that students will use their knowledge of the fundamentals of the discipline, as well as its critical evolution over time, to contribute to the field through their own research.

2. Mastery of one or more of the sub-fields offered in the major.

Our program offers subfields that form the specialization a graduate student will develop while enrolled in the program. We expect students graduating from the program to have mastered one or more of these subfields. Specifically, they should understand the traditional and critical literature of the subfield and be able to demonstrate mastery of it.

3. Ability to think politically. Much like our expectations of the undergraduate majors, we require students to think politically about social phenomena. Comprehending that all social, economic, and cultural processes are also political is a crucial learning outcome. That comprehension creates a knowledgeable citizenry capable of acting on policy decisions and conduct. The recognition that no knowledge is innocent, but that all knowledge has consequences, is key to this learning outcome.

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.politicalscience.hawaii.edu/graduate-program.html
Student Handbook. URL, if available online: http://www.politicalscience.hawaii.edu/graduate-program.html
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2014:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

SKIP

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, the effect of pre-requisites on learning achievement.
Investigate other pressing issue related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

16) If the program did not engage in assessment activities, please explain.

At the doctorate level, we have instituted a number of important changes that have visibly impacted our program.

First, we have revamped our advising process so that each incoming PhD student is assigned a faculty adviser from the very outset. Students work closely with their advisers through coursework, the proposal, comprehensive exams, and advancement to candidacy. Advisers are required to meet with their students at least once each semester and to file an annual report on the progress of their advisees with the graduate chair. If a student is *not* making satisfactory progress, the adviser brings it to the attention of the graduate chair, who then discusses such "unsatisfactory" cases with the rest of the graduate faculty in the department. A determination is made as to whether the student is to be placed on "probation," and this is communicated to the Office of Graduate Education. Revamping the advising process, and ensuring the continuous mentoring of our doctoral students, is the single most important assessment activity we have embarked on in recent years.

Second, we have worked hard to encourage doctoral students who have been in the program a long time either to finish their dissertations or to consider leaving the program. This proactive endeavor has had good results: the number of PhD students in our department has more than halved over the last six to eight years. Today, we have about 70 PhD students enrolled in our program, compared with twice that number around six years ago. More importantly, a significant number of our ABDs who had been unable to complete their dissertations did so in recent years as a result of our committed policy in this regard.

Third, in accordance with the changing employment conditions for PhDs in the social sciences in recent years, we have drastically reduced the size of our incoming PhD cohorts each year and have enacted a "cap" whereby we will not offer admission to more than 20 applicants, with the expectation that about half that number (10) will choose to join us.

It is difficult to conceive of a graduate program that does *not* engage in assessment activities. Every graduate student receives comments on their papers; their papers and assignments are graded; they are evaluated every time they participate in class or in departmental events. Faculty are constantly required to write reference letters for them; that, too, is a part of student assessment. I could go on for pages, but assessing student performance is as integral to the life of a department as breathing. So, unless you want me to construct a Borgesian map that details every living moment of our day and night, I suggest you take my word that yes, we do engage in assessment of our students' scholarship and progress. How that meshes, or not, with Student Learning Outcomes and rubrics and all the rest is something I choose not to explore at this point.