Unit: Office of the Vice Chancellor for Academic Affairs
Program: A1_Assessment_ProgExample
Degree: NA
Date: Thu Jun 30, 2011 - 2:24:10 pm
1) Below are the program student learning outcomes submitted last year. Please add/delete/modify as needed.
Recommended format: "Students [can or will] [action verb] [learning statement]."
Example: Students can compare and contrast major economic theories.
2) As of last year, your program's SLOs were published as follows. Please update as needed.
3) Below is the link to your program's curriculum map (if submitted in 2009). If it has changed or if we do not have your program's curriculum map, please upload it as a PDF.
Curriculum Map File(s) from 2010:
- File (03/16/2020)
4) The percentage of courses in 2009 that had course SLOs explicitly stated on the syllabus, a website, or other publicly available document is indicated below. Please update as needed.
5) State the assessment question(s) and/or goals of the assessment activity. Include the SLOs that were targeted, if applicable.
Because assessment of student learning is broad and multi-faceted, it includes asking and answering questions such as these:
- How well are our majors achieving our program outcomes?
- Which course syllabi address our program outcomes?
- What do students think about overall program quality?
- How can we use the results from a previous assessment project?
- Are students prepared when they enter our 300-level courses? What are their grades in the pre-requisite courses?
- What types of jobs do our majors expect to get after they graduate?
- Should we revise our program outcomes?
- Are our courses aligned with program outcomes? Do students have sufficient opportunities to practice lab techniques so they are more likely to meet our expectations?
- What rubric will work to evaluate student projects?
- What examples of student work can serve as “anchors” or “benchmarks” for high, average, and unacceptable quality?
6) State the type(s) of evidence gathered.
Typical types of direct evidence:
- Thesis/Dissertation (undergraduate & graduate)
- Oral: Performance/Presentation/Exam/Defense
- Embedded Assignment
- Exam: Qualifying/Comprehensive (graduate level)
- Exhibition/Performance/Project
- Applied Experience Outside the Classroom
- Exam: Embedded
- Exam: Exit/Program/Professional/National
- Capstone
- Portfolio
- Publications (local/national)
- Entry-level Diagnostics
- IRB Approval of Research
- Post-graduation Data
- Student Interview/Focus Group
- Student Survey: Program
- Course Grade(s)
- Course Survey (e.g., CAFÉ)
- Student Survey: Other
- Employer Survey/Interview
- Alumni Survey
- Student Self-assessment/Reflection
- Syllabi Review
7) Who interpreted or analyzed the evidence that was collected?
8) How did they evaluate, analyze, or interpret the evidence?
9) State how many persons submitted evidence that was evaluated.
If applicable, please include the sampling technique used (e.g., random sampling).
10) Summarize the actual results.
A brief summary is sufficient.
11) How did your program use the results? --or-- Explain planned use of results.
Please be specific; detailed responses are appreciated.
12) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, program aspects and so on.
Insights about the assessment process can help others.
13) Other important information:
If the program did not engage in assessment, please explain here.
If the program has created an assessment plan, please give an overview here.