Unit: Urban & Regional Planning
Program: Urban & Regional Plan (PhD)
Degree: Doctorate
Date: Sun Nov 11, 2018 - 3:54:49 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Evaluate, synthesize and conduct independent research relevant to building knowledge in the field of urban and regional planning;

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

2. Demonstrate mastery of rigorous research design and an application of research method within the field of planning; and

(2. Demonstrate understanding of research methodology and techniques specific to one’s field of study., 3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study.)

3. Present, discuss, and defend research findings through effective oral and written communication.

(5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience.)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://manoa.hawaii.edu/durp
Student Handbook. URL, if available online: http://manoa.hawaii.edu/durp/files/2018/01/PhD-Guidelines.pdf
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2018:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.") (Check one):

No
Yes, on some (1-50%) of the program SLOs
Yes, on most (51-99%) of the program SLOs
Yes, on all (100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between June 1, 2015 and October 31, 2018?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period June 1, 2015 to October 31, 2018? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
No (skip to question 17)
Investigate other pressing issues related to student learning achievement for the program (explain in question 8)
Other:

8) Briefly explain the assessment activities that took place.

Based on a Fall 2016 review of the PhD program requirements and curriculum, faculty and doctoral students undertook a formative program assessment and qualitative analysis to facilitate program improvement through faculty-student collaboration. The assessment coordinator organized a colloquium in which PhD candidates presented their dissertation research. Faculty evaluated the oral presentations using a scorecard based on the PhD program student learning outcomes (SLOs). Written comments from evaluators were then coded and presented as strengths and weaknesses corresponding to each SLO component to generate a preliminary qualitative assessment. At a subsequent retreat, faculty and doctoral students reviewed the preliminary qualitative assessment along with the PhD curriculum, core course syllabi, and PhD guidelines. They identified strengths and shortcomings in the existing curriculum, mapped degree requirements to program SLOs, and proposed revisions to better align the curriculum with program SLOs. Next steps were outlined, including increasing the number of core course credits and providing more teaching opportunities for PhD candidates through summer school and online courses.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Six doctoral candidates presented their dissertation research at the colloquium. The sample included only those students who had advanced to candidacy.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other: Facilitated discussion at a retreat

13) Summarize the results of the assessment activities checked in question 7. For example, report the percentage of students who achieved each SLO.

Faculty and doctoral students (10 faculty, 5 current students, 1 recent graduate) participated in a facilitated discussion that resulted in a change to the curriculum and a revised curriculum map for the PhD program. The faculty curriculum sub-committee was tasked with developing rubrics to assess program milestones, which will be used for future assessment of program-level learning outcomes. The committee is also examining the course content of one of the required methods courses to address faculty concerns about overlap with the required research design course.

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used the results.

Faculty and PhD students focused on two questions during the retreat. The answers from the participants are summarized below:

How can we strengthen the PhD curriculum? Should we revise existing courses or assignments? Should we add new courses?

Research design and methods requirements are weak. We recommend additional methods courses. Research design should be taught separately from research methods. Core courses need to be more consistent across instructors.

Do you think our SLOs are adequate? Should we revise any SLOs for better alignment with proposed changes (if any)?

Current SLOs are adequate. We should enforce the timely attainment of milestones.

Based on the outcomes, we did the following:

1. Increased course requirements (from 15 to 21 credits). The new curriculum went into effect in Fall 2017.

2. Revised the curriculum map.

3. Increased teaching opportunities for PhD candidates by offering summer courses through Outreach College.

4. Began offering an annual PhD colloquium, which first-year PhD students are required to attend.

5. Began developing a rubric to assess the proposal defense.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

The collaborative nature of the assessment exercise was especially effective in catalyzing the necessary program-level changes because it facilitated dialogue between faculty and students.

17) If the program did not engage in assessment activities, please justify.

N/A