The basic steps:
1. Define the program goals and/or mission.
2. Establish student learning objectives/outcomes (SLOs).
3. Determine “learning opportunities” (i.e., where the learning will take place).
4. Undertake an assessment process: establish a research question/goal; collect evidence (direct or indirect evidence of student learning); analyze and interpret the evidence.
5. Based on the results, create and implement an action plan to improve the program and student learning (or a plan to celebrate success).
Steps 1-3 are typically done once and then revisited as needed. Steps 4 and 5 are repeated each time an assessment activity/process takes place.
Please read the detailed information below.
Helpful guides and “big picture” examples
Two helpful guides/workbooks with step-by-step information:
A. Program-Based Review and Assessment: Tools and Techniques for Program Improvement. Office of Academic Planning and Assessment. University of Massachusetts Amherst. (2001)
B. Tools and Techniques for Program Improvement: Handbook for Program Review & Assessment of Student Learning. Office of Institutional Assessment, Research, and Testing. Western Washington University. (2006)
Assessment and Curriculum Support Center workshop slides and handouts (PDF)
F. Assessment Planning for Busy People (2015)
Create a Department/Program Assessment Plan
A written assessment plan that can be distributed within or outside the department/program is useful. Below are elements of a good plan, guiding questions, and tips/notes.
- Download or save a copy of this planning template to create the department’s/program’s assessment plan.
- Sample of a completed template (Foundations Written Communication).
- Evaluate your plan using a framework (scroll to the bottom of the planning template).
|Elements of a Program Assessment Plan|Guiding Questions|Tips & Notes|
|---|---|---|
|Program Mission Statement and/or Goals| | |
|Program Learning Objectives/Outcomes (intended outcomes), also known as “SLOs”| |Workshop Presentation Slides and Handouts (PDFs)|
|Long-range Timeline and Lead People for Each Assessment Activity| | |
Plan an Assessment Activity/Project
After the program creates its master assessment plan (including the elements above), the program starts evaluating how well students are meeting the desired SLOs. Some programs choose to evaluate one outcome per year; other programs tackle multiple outcomes at the same time. Use the following information to design a meaningful, valuable assessment activity/project.
- Download or make a copy of this template to plan an assessment project
- Example of a completed template (Foundations Written Communication) [PDF]
|Elements of an Assessment Project Plan|Tips & Notes|
|---|---|
|1. Assessment research questions (or goals for assessment activities)| |
|2. Assessment methods and timelines|See also “Choose a Method to Collect Data/Evidence.”|
|4. Decisions, plans, and recommendations| |
Characteristics of Good Assessment Planning
- Focuses on the program (e.g., the major) rather than individual courses
- Has 2-6 goals and 3-8 student learning outcomes
- Anticipates how the results will be used for improvement and decision-making
- Is collaboratively created with input and discussion from the entire department
- Is systematic
- Is manageable
- Uses multiple data-collection methods over time
- Is conveyed to students, who understand their role in assessment
- Leads to improvement
- Has a foundation in Mānoa’s mission and goals and undergraduate learning objectives (if applicable)
- Includes an evaluation of the assessment
- Describes the goal(s) of each planned assessment project
Develop a Comprehensive Assessment Plan
We recommend starting with a small assessment project. Your program can have a comprehensive assessment plan that is implemented gradually. Here are some techniques for designing a comprehensive plan:
- Create a multi-year plan in which 1 or 2 SLOs are assessed each year.
- Use evidence from a capstone experience to simultaneously evaluate multiple SLOs.
- Plan a direct and an indirect data collection method for each outcome. Examples:
  - An exit survey given every other year asks students to self-report on all outcomes, while direct evidence of student learning is evaluated for each outcome on an annual rotation (one outcome per year).
  - Student projects in a capstone course are designed to provide evidence for several outcomes. In addition, indirect evidence in the form of an alumni survey and job placement figures is used to triangulate the conclusions reached through analysis of the capstone course results.
- Use a pre-test/post-test design to gather evidence on possible growth (“value-added”) from freshman to senior year.
Last reviewed/updated: 09/13/2022
Thank you to “Program-Based Review and Assessment” (Academic Planning & Assessment, University of Massachusetts Amherst) and the San Diego State University Committee on Assessment for information that helped guide this page.