Unit: Office of the Vice Chancellor for Academic Affairs
Program: A1_Assessment_ProgExample
Date: Fri Aug 23, 2013 - 9:26:16 am

1) Below are your program's student outcomes (SOs). Please add or update as needed.

Because the AO does not work directly with students or offer an academic degree, it has program outcomes rather than student outcomes.

  1. The AO has in place an infrastructure to sustain a culture of assessment.
  2. Faculty members are aware of opportunities to publish/present on the scholarship of teaching and learning in their field.
  3. Academic degree programs complete the assessment cycle, which includes faculty members using assessment results to improve student learning.
  4. Department leaders and administrators use student learning assessment to guide planning.
  5. The campus community (faculty members, administrators, staff, students) perceives program-level assessment as supporting student learning.

2) Your program's SOs are published as follows. Please update as needed.

Program's Website. URL: http://manoa.hawaii.edu/assessment
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure. URL, if available online:
UHM Catalog. Page Number:
Other:
Other:

3) Provide the program's activity map or other graphic that illustrates how program activities/services align with program student outcomes. Please upload it as a PDF.

Activity Map File(s) from 2013:

4) Did your program engage in any program assessment activities between June 1, 2012 and September 30, 2013? (e.g., establishing/revising outcomes, aligning activities to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys, etc.)

Yes
No (skip to question 14)

5) For the period June 1, 2012 to September 30, 2013: State the assessment question(s) and/or assessment goals. Include the student outcomes that were targeted, if applicable.

How well did the AO meet its outcomes this academic year?

6) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #5.

Surveys with open- and closed-ended questions were distributed during or after every AO event/workshop.

7) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Out of 259 attendees, 249 (96%) submitted a survey.

8) Who interpreted or analyzed the evidence that was collected? Check all that apply.

Program faculty/staff member(s)
Faculty/staff committee
Ad hoc faculty/staff group
Director or department chairperson
Persons or organization outside the university
Students (graduate or undergraduate)
Dean or Associate Dean
Advisory Board
Other: AO faculty specialists

9) How did he/she/they evaluate, analyze, or interpret the evidence? Check all that apply.

Compiled survey results
Used quantitative methods on student data (e.g., grades, participation rates) or other numeric data
Used qualitative methods on interview, focus group, or other open-ended response data
Scored exams/tests/quizzes
Used a rubric or scoring guide
Used professional judgment (no rubric or scoring guide used)
External organization/person analyzed data (e.g., Social Science Research Institute)
Other:

10) For the assessment questions/goals stated in Question #5, summarize the actual results.

86% of respondents met the learning outcomes.

97% reported that the event was useful, and 91% reported that they would apply what they learned to future assessment tasks.

11) What was learned from the results?

Overall, the events and workshops are functioning well. Surveys allow attendees to respond anonymously and to give constructive feedback.

12) State how the program used the results or plans to use the results. Please be specific.

No major changes were made. Minor changes to event content were made based on specific suggestions. When fewer than 90% of attendees met the learning outcome for a particular event, that event's content or presentation technique was modified.

13) Reflect on the assessment process. Is there anything related to assessment procedures your program would do differently next time? What went well?

Everything went well. Distributing surveys during versus after a session did not appear to affect response rates.

14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.