Unit: Urban & Regional Planning
Program: Urban & Regional Planning (MURP)
Degree: Master's
Date: Wed Oct 07, 2015 - 12:12:45 pm

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

Upon completion of the Master of Urban and Regional Planning, students will be able to:

1)    Explain, critique, and apply prominent planning theories/concepts to analyze planning issues;

2)    Demonstrate an understanding of urbanization processes and rationales for planned interventions;

3)    Apply planning methods to organize, analyze, interpret and present information;

4)    Critically and creatively develop planning inquiries or processes to foster solutions-oriented decision-making;

5)    Effectively collaborate as a planning team to work with a client and/or stakeholders to assess and address a relevant planning problem to create a plan or professional report;

6)    Effectively present oral and written work (as a plan, professional report, or research paper) in a coherent, persuasive and professional manner; and

7)    Reflect upon the ethical implications of the choices planners make as professionals.

 

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.durp.hawaii.edu/
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2015:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

Yes
No (skip to question 16)

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, and the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

Within the last 18 months, the Department has 1) updated the SLOs for the MURP degree, 2) collected data to evaluate student work, and 3) updated the curriculum to improve student learning and outcomes regarding the capstone.

The Department has created signature assignments within the core courses to reflect SLOs 1-3. As part of this process, it became clear that the wording of SLOs 1 and 2 needed updating; the faculty revised the wording together.

Regarding data collection, the goal at this point in the assessment process is to continue gathering data so that enough longitudinal information is available to make meaningful inferences. Data are collected from signature assignments within required courses and from an evaluation rubric used for the MURP capstone project. For the signature assignments within the three core courses, faculty committees are formed to review them on an annual basis using a scoring rubric. The faculty have also been using a rubric to assess student performance on the capstone project; this is the longest-running data collection, ongoing since Spring 2012. Most recently, the faculty agreed to develop and use a rubric to assess elements of “effective collaboration” within the practicum course. Once this is in place, all MURP SLOs will have an assessment tool and a system of data collection.

Regarding curriculum improvements, based on student feedback about the capstone project, the faculty decided to offer a “capstone proposal and completion” course on a regular basis. This class helps students focus their efforts on the capstone project and provides peer and instructor feedback. It was offered last year as a trial, and student feedback was highly positive. The Department will offer it each semester going forward.

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

For review of the signature assignments, a committee of three faculty members (other than the course instructor) will be assembled in the Spring semester. The committee will review a random sample of student work on the signature assignments pertaining to SLOs 1 through 3 within each of the core classes.

The capstone rubric is completed by three committee members for each graduating student each semester.

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

The data provided by the capstone rubric offer the richest insight into the Department's current assessment activities and data collection, because that data set extends back the furthest (to Spring 2012). Results are provided in the following three tables. The numerical values represent a three-point scale where "1" signifies "does not meet expectations," "2" signifies "meets expectations," and "3" signifies "exceeds expectations."

Table 1. Capstone rubric results regarding critical thinking. Each cell shows the average score, with the standard deviation in parentheses.

Criteria:
(a) The student demonstrates working knowledge of the stated planning sub-field
(b) The document presents a clear analysis and cohesive argument/logic
(c) The document provides an appropriate literature review and background material
(d) The analytical framework and methodology are appropriate and well-documented
(e) The student organizes, interprets, and presents information well
(f) The student demonstrates strong analytical skills and critical thinking

Year of Capstone Completion | (a)       | (b)       | (c)       | (d)       | (e)       | (f)
2012-2013                   | 2.8 (0.4) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5)
2013-2014                   | 2.6 (0.6) | 2.4 (0.7) | 2.4 (0.7) | 2.2 (0.8) | 2.5 (0.6) | 2.4 (0.6)
2014-2015                   | 2.8 (0.4) | 2.5 (0.6) | 2.7 (0.5) | 2.6 (0.5) | 2.7 (0.5) | 2.7 (0.5)
Total                       | 2.7 (0.5) | 2.5 (0.6) | 2.6 (0.6) | 2.5 (0.7) | 2.6 (0.5) | 2.6 (0.6)

 

Table 2. Capstone rubric results regarding written communication. Each cell shows the average score, with the standard deviation in parentheses.

Criteria:
(a) The document is well-edited
(b) Citations are done properly and consistently
(c) Overall, the document is of professional quality

Year of Capstone Completion | (a)       | (b)       | (c)
2012-2013                   | 2.6 (0.5) | 2.7 (0.5) | 2.7 (0.5)
2013-2014                   | 2.3 (0.7) | 2.5 (0.7) | 2.4 (0.7)
2014-2015                   | 2.6 (0.5) | 2.8 (0.4) | 2.8 (0.5)
Total                       | 2.5 (0.6) | 2.6 (0.6) | 2.6 (0.6)

 

Table 3. Capstone rubric results regarding oral communication. Each cell shows the average score, with the standard deviation in parentheses.

Criteria:
(a) The important concepts from the paper are selected for presentation
(b) The PowerPoint/visuals enhance the content
(c) The speaker appears practiced and polished
(d) The student interacts well with the audience during the Q&A period
(e) The presentation is made in a clear and audible voice
(f) Overall, the presentation is of professional quality

Year of Capstone Completion | (a)       | (b)       | (c)       | (d)       | (e)       | (f)
2012-2013                   | 2.8 (0.4) | 2.8 (0.4) | 2.8 (0.5) | 2.8 (0.4) | 2.9 (0.3) | 2.8 (0.4)
2013-2014                   | 2.6 (0.6) | 2.6 (0.6) | 2.6 (0.5) | 2.6 (0.5) | 2.6 (0.5) | 2.6 (0.5)
2014-2015                   | 2.8 (0.4) | 2.7 (0.6) | 2.7 (0.6) | 2.8 (0.4) | 2.8 (0.4) | 2.8 (0.4)
Total                       | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.4) | 2.7 (0.5)

 

Looking at the data, several themes are clear. The first is that students almost always meet or exceed faculty expectations for professional practice. Across the 2012-2015 period, the average score for the criterion "the student demonstrates working knowledge of the stated planning sub-field" is 2.7, with a standard deviation of 0.5. There is some variation between academic years, though a longer time series is needed before drawing firmer conclusions.

13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

The results of the assessment data and activities are presented each semester at faculty meetings. So far, their use has been mostly qualitative: catalyzing discussions that led to 1) updating the SLOs and 2) revising the sequence of courses. Once enough quantitative data have accumulated to be meaningful, they will be used to improve specific areas of the curriculum. For the capstone, which has three years of data collection, the results are quite positive in terms of students meeting the expectations of the degree.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

As described in question 14.

16) If the program did not engage in assessment activities, please explain.