Report & Plan Use of Results

Congratulations are in order if you have finished evaluating evidence of student learning or compiling perceptions of student learning! Using the results for improvement is the goal of assessment. Programs use assessment results to inform decision making and to improve teaching and learning. Results can also highlight successes such as the following:

  • a better alignment of the curriculum with desired outcomes;
  • the creation of useful rubrics;
  • a set of explicit standards and corresponding samples of student work (“anchors”);
  • evidence that students are meeting or exceeding learning expectations.

Below are elements that are typically found in a report on assessment results (note: the elements may appear in a different sequence and some may not be appropriate in your report):

Each element below is followed by guiding questions to consider; some elements and questions may not apply to your report.
1. The learning outcome(s) and assessment question(s) that were investigated.  
2. The type of (learning) evidence that was collected and when it was collected.
  • Did students demonstrate their learning in a course assignment? A course exam? A program-level requirement such as a portfolio?
3. Description of the sampling method (if sampling was used).
  • Who submitted data?
  • How many students are in the program (or, how many graduate each year)?
  • How many were asked to participate in the study?
  • How were they selected?
  • How many actually participated?
  • How many non-responses were there?
4. The process to evaluate the evidence.
  • How was the exam scored? (include the key in the report)
  • What scoring rubric was used? (include the rubric in the report)
  • Were benchmark samples of student work used? 
5. A timeline of key events.
  • When did the following occur: select learning outcome to investigate; collect evidence of student learning; evaluate student learning?
6. Summary of the results.
  • How many students scored “1,” “2,” “3,” or “4”?
  • How many students passed the exam?
  • How many faculty members agreed, disagreed, were neutral?
7. Answer(s) to the assessment question and whether the criteria for success were met.
  • Did a sufficient percentage of students meet faculty expectations?
  • When the results are disaggregated by student characteristics of interest (e.g., ethnicity), do all groups perform at the same level?
8. An explanation/interpretation of the results and the specific actions under consideration (e.g., celebrate success, make a change aimed at optimizing student learning).

To guide interpretation, the faculty and other stakeholders may ask questions such as the following.

Curriculum-related actions: Given the results, should we . . .

  • Change how courses are taught or the assignments?
  • Revise course content?
  • Widely share anchors with students?
  • Modify frequency or schedule of course offerings?
  • Revise or enforce prerequisites?
  • Revise the course sequence?
  • Add or delete course(s)?

Resource-related actions: Given the results, should we . . .

  • Hire or re-assign faculty and/or staff?
  • Increase classroom space?
  • Train faculty and/or staff?

Academic-process actions: Given the results, should we . . .

  • Improve how we use technology?
  • Revise advising standards or processes?
  • Revise admission criteria?
9. Evaluation of the assessment plan and changes to be made (if needed).
  • What aspects of the assessment process worked well and what changes might make it more effective?
  • Given our experience carrying out this assessment, should we:
    • Revise the program or course learning outcomes?
    • Change the criteria for success? Modify our expectations?
    • Revise data-collection or data-evaluation methods?
    • Revise measurement approaches?
    • Change the timeline?
    • Collect and analyze additional data?
10. The action(s) to be taken and a timeline of who is responsible for implementing the action(s). 
  • What specific action(s) did the faculty agree to take, either to celebrate the results or to improve student learning and the curriculum?
  • What are the major steps to implement the plan?
  • Who is responsible and what is the timeline?

Tips for Writing up the Results

  • Determine the specific goal(s) of the report. For example, the goal may be to
    • communicate to colleagues in the department the student skills that were assessed, how evidence was collected and evaluated, what the results mean and how they will be used.
    • explain why a particular data-collection method was selected, how evaluation took place, and how results will be used as part of program improvement.
    • entice faculty to attend a meeting to interpret the results and decide on appropriate actions that are likely to improve student learning.
  • Attend to the level of detail: faculty in the department will appreciate more detail, while deans, directors, and provosts will appreciate less.
  • Report the results at a level appropriate for the audience receiving the report. Use language those individuals will understand, and explain technical terms. If a statistician is hired, be sure to ask for a plain-language description of the statistical terms used.
  • Keep it short and concise; be careful not to overwhelm the reader. If a written report is lengthy, include a one-page executive summary.
  • Be accurate and be careful to not mislead.
  • Use visual displays, bullet lists, active voice.
  • If other assessment results exist, bring them into the discussion.

Tips for Using the Results

  • Distribute the report of the results as widely as possible.
  • Provide information that will assist faculty interpretation of the results.
  • Act on the results in ways that will improve the assessment process, student learning, or both. Assessment results are important evidence on which to base requests for funding, curriculum changes, the rethinking of faculty lines, and more.
  • Disappointing (negative) assessment results can have a positive effect when they are used to improve the learning process.
  • Present the results in several ways: face-to-face meeting, written report, workshop format in which the report serves as the springboard for brainstorming possible next steps.
  • Engage the program faculty members, staff, and students in discussions about the results and how they might be used. Questions like these can start the conversation:
    • Do the results live up to our expectations?
    • Are our expectations appropriate? Should expectations be changed?
    • What were the most effective tools to assess student learning? Can those tools be shared and used in other courses or programs?
  • Once there is consensus on the action(s) to be taken, create an action plan that describes the actions the program will take, who will take those actions, and the timeline for implementing actions.
  • Monitor changes as they are implemented to determine whether they have the desired effect(s).
