Program: Urban & Regional Plan (MURP)
Degree: Master's
Date: Wed Oct 07, 2015 - 12:12:45 pm
1) Below are your program's student learning outcomes (SLOs). Please update as needed.
Upon completion of the Master's in Urban and Regional Planning (MURP), students will be able to:
1) Explain, critique, and apply prominent planning theories/concepts to analyze planning issues;
2) Demonstrate an understanding of urbanization processes and rationales for planned interventions;
3) Apply planning methods to organize, analyze, interpret and present information;
4) Critically and creatively develop planning inquiries or processes to foster solutions-oriented decision-making;
5) Effectively collaborate as a planning team, working with a client and/or stakeholders to assess and address a relevant planning problem and create a plan or professional report;
6) Effectively present oral and written work (as a plan, professional report, or research paper) in a coherent, persuasive and professional manner; and
7) Reflect upon the ethical implications of the choices planners make as professionals.
1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)
2) Your program's SLOs are published as follows. Please update as needed.

3) Please review, add, replace, or delete the existing curriculum map.
- File (03/16/2020)
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?


6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

7) Briefly explain the assessment activities that took place in the last 18 months.
Within the last 18 months, the Department has 1) updated the SLOs for the MURP degree, 2) collected data to evaluate student work, and 3) updated the curriculum, particularly the capstone, to improve student learning and outcomes.
The Department has created signature assignments within the core courses to reflect SLOs 1-3. As part of this process, it became clear that the wording of SLOs 1 and 2 needed updating; the faculty revised this wording together.
Regarding data collection, the goal at this point in the assessment process is to continue gathering data so that enough longitudinal information is available to make meaningful inferences. Data are collected from signature assignments within required courses and from an evaluation rubric used for the MURP capstone project. For the signature assignments in the three core courses, faculty committees are formed annually to review student work using a scoring rubric. The faculty have also been using a rubric to assess student performance on the capstone project; this is the longest-running data series, collected since Spring 2012. Most recently, the faculty agreed to develop and use a rubric to assess elements of “effective collaboration” within the practicum course. Once this is in place, every MURP SLO will have an assessment tool and a system of data collection.
Regarding curriculum improvements, based on student feedback on the capstone project, the faculty decided to offer a “capstone proposal and completion” course on a regular basis. This class helps students focus their efforts on their capstone projects and provides peer and instructor feedback. It was offered last year as a trial, student feedback was highly positive, and the Department will offer it each semester going forward.
8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)
Direct evidence of student learning (student work products)

Indirect evidence of student learning

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.
For the review of signature assignments, a committee of three faculty members (other than the course instructor) will be convened in the spring semester. The committee will review a random sample of student work from each core class on the signature assignments pertaining to SLOs 1 through 3.
The capstone rubric is completed by three committee members for each graduating student each semester.
10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.
The data from the capstone rubric offer the richest insight into the Department's current assessment activities and data collection because they extend back the furthest. Results are provided in the following three tables. The numerical values represent a three-point scale in which "1" signifies "does not meet expectations," "2" signifies "meets expectations," and "3" signifies "exceeds expectations."
Table 1. Capstone rubric results regarding critical thinking. Each cell shows the average score, with the standard deviation in parentheses. Criteria:
(a) The student demonstrates working knowledge of the stated planning sub-field
(b) The document presents a clear analysis and cohesive argument/logic
(c) The document provides an appropriate literature review and background material
(d) The analytical framework and methodology are appropriate and well-documented
(e) The student organizes, interprets, and presents information well
(f) The student demonstrates strong analytical skills and critical thinking

| Year of Capstone Completion | (a) | (b) | (c) | (d) | (e) | (f) |
| 2012-2013 | 2.8 (0.4) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) |
| 2013-2014 | 2.6 (0.6) | 2.4 (0.7) | 2.4 (0.7) | 2.2 (0.8) | 2.5 (0.6) | 2.4 (0.6) |
| 2014-2015 | 2.8 (0.4) | 2.5 (0.6) | 2.7 (0.5) | 2.6 (0.5) | 2.7 (0.5) | 2.7 (0.5) |
| Total | 2.7 (0.5) | 2.5 (0.6) | 2.6 (0.6) | 2.5 (0.7) | 2.6 (0.5) | 2.6 (0.6) |
Table 2. Capstone rubric results regarding written communication. Each cell shows the average score, with the standard deviation in parentheses. Criteria:
(a) The document is well-edited
(b) Citations are done properly and consistently
(c) Overall, the document is of professional quality

| Year of Capstone Completion | (a) | (b) | (c) |
| 2012-2013 | 2.6 (0.5) | 2.7 (0.5) | 2.7 (0.5) |
| 2013-2014 | 2.3 (0.7) | 2.5 (0.7) | 2.4 (0.7) |
| 2014-2015 | 2.6 (0.5) | 2.8 (0.4) | 2.8 (0.5) |
| Total | 2.5 (0.6) | 2.6 (0.6) | 2.6 (0.6) |
Table 3. Capstone rubric results regarding oral communication. Each cell shows the average score, with the standard deviation in parentheses. Criteria:
(a) The important concepts from the paper are selected for presentation
(b) The PowerPoint/visuals enhance the content
(c) The speaker appears practiced and polished
(d) The student interacts well with the audience during the Q&A period
(e) The presentation is made in a clear and audible voice
(f) Overall, the presentation is of professional quality

| Year of Capstone Completion | (a) | (b) | (c) | (d) | (e) | (f) |
| 2012-2013 | 2.8 (0.4) | 2.8 (0.4) | 2.8 (0.5) | 2.8 (0.4) | 2.9 (0.3) | 2.8 (0.4) |
| 2013-2014 | 2.6 (0.6) | 2.6 (0.6) | 2.6 (0.5) | 2.6 (0.5) | 2.6 (0.5) | 2.6 (0.5) |
| 2014-2015 | 2.8 (0.4) | 2.7 (0.6) | 2.7 (0.6) | 2.8 (0.4) | 2.8 (0.4) | 2.8 (0.4) |
| Total | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.5) | 2.7 (0.4) | 2.7 (0.5) |
Looking at the data, several themes are clear. First, students almost always meet or exceed faculty expectations for professional practice. Across the three academic years (2012-2015), the average score for the criterion "the student demonstrates working knowledge of the stated planning sub-field" is 2.7, with a standard deviation of 0.5. There is some variation between academic years, though a longer time series is needed before meaningful conclusions can be drawn.
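For reference, the figures in Tables 1-3 are averages and standard deviations of the 1-3 rubric ratings assigned by committee members. A minimal sketch of how such a per-criterion, per-year summary could be computed is shown below; the file name, column names, and data layout are hypothetical and do not represent the Department's actual workflow, and whether the reported values use the population or sample standard deviation is not specified (the sketch uses the population form).

```python
# Minimal sketch (hypothetical data layout): summarize 1-3 rubric ratings
# by academic year and criterion, as in Tables 1-3.
import csv
from collections import defaultdict
from statistics import mean, pstdev

scores = defaultdict(list)  # (academic_year, criterion) -> list of ratings

with open("capstone_rubric_ratings.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Each row: one committee member's rating of one student on one criterion.
        key = (row["academic_year"], row["criterion"])
        scores[key].append(int(row["rating"]))  # rating is 1, 2, or 3

for (year, criterion), ratings in sorted(scores.items()):
    avg = mean(ratings)
    sd = pstdev(ratings) if len(ratings) > 1 else 0.0
    print(f"{year} | {criterion}: {avg:.1f} ({sd:.1f})")
```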
13) What best describes how the program used the results? (Check all that apply.)

14) Please briefly describe how the program used the results.
The results of assessment data and activities are presented each semester at faculty meetings. So far the use has been mostly qualitative, catalyzing discussions that led to 1) updating the SLOs and 2) updating the flow of courses. Once enough quantitative data have accumulated to be meaningful, they will be used to improve specific areas of the curriculum. For the capstone, which now has three years of data, the results are quite positive in terms of students meeting the expectations of the degree.
15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.
As described in question 14.