Archived Date: 26 February 2024
Part 1. Data-collection Methodology: Direct and Indirect
Part 2. Benefits and Drawbacks of Data-collection Methods
Part 3. Evaluate Your Choice of Data-collection Method
See also: Workshops and Events
- Take the Next Step in Program Learning Assessment: Collect & Review Evidence of Learning (2014)
- Collecting Data and Evidence of Student Learning (2009 and 2010)
- Efficient Program Assessment (2010)
- Examples of Program-level Assessment of Student Learning (2009)
Part 1. Data-collection Methodology: Direct and Indirect
Tips
- Choose methods that will
  - answer specific assessment questions,
  - be seen as credible to the faculty and the intended users of the results, and
  - provide useful information. Quantity is not the goal.
- Use more than one method whenever possible, especially when answering questions about highly valued learning outcomes.
- Use or modify existing evidence whenever possible. Inventory what evidence of student learning and perceptions about the program already exist. The curriculum map is a useful tool when conducting the inventory.
- Choose methods that are feasible given your program’s resources (people and money) and the amount of time faculty are willing to devote to assessment activities.
- Feasibility tips:
  - For programs with 40 or more graduates each year, we suggest a random sample of at least 40 students. For programs with fewer than 40 graduates each year, plan on collecting evidence from 100% of the graduating students.
  - If the evidence is an undergraduate student project such as a research paper, and faculty other than the course instructor will evaluate the student work, keep this in mind: in our experience, it takes a faculty member an average of 15 minutes to apply a rubric to a research paper or other significant written project. So if the program can recruit 6 faculty members to spend 90 minutes each evaluating student work, that is (a sample of) 36 students when each paper is evaluated by only one faculty member; see the sketch after this list.
Types of Direct Data-collection Methods
DIRECT METHODS | Examples |
---|---|
Licensure or certification | Nursing program students’ pass rates on the NCLEX (nursing licensure) examination. |
National exams or standardized tests | a) Freshmen’s and seniors’ scores on the Collegiate Learning Assessment (CLA) or Collegiate Assessment of Academic Proficiency (CAAP). b) Senior-level biology students’ scores on the GRE Subject Test in Biology. |
Local exams (external to courses) | Entering students’ scores on the Mānoa Writing Placement Exam. Note: these local exams are not course exams (see “embedded testing or quizzes” for course exams/quizzes) |
Embedded testing or quizzes | a) Students’ pass rates on the German 202 final exam (students in all sections of German 202 take the same final exam). b) Two questions from the History 151 final exams are scored by a team of faculty members, and results are used for program-level assessment. |
Embedded assignments | The program selects course assignments (“signature assignments”) that can provide information on a student learning outcome. Students complete these assignments as a regular part of the course, and instructors grade them for the course grade. In addition, the assignments are scored using criteria or a scoring rubric, and these scores are used for program-level assessment (see the scoring sketch after this table). |
Grades calibrated to clear student learning outcome(s) | Professors give grades based on explicit criteria that are directly related to particular learning outcomes. (See also “embedded testing or quizzes” and “embedded assignments.”) |
Portfolios | A collection of student work such as written assignments, personal reflections, and self-assessments. Developmental portfolios typically include work completed early, midway, and late in students’ academic careers so growth can be noted. Showcase portfolios include students’ best work and aim to show the students’ highest achievement level. |
Pre-/post-tests | When used for program assessment, students take the pre-test as part of a required introductory course and the post-test during their senior year, often in a required course or capstone course. Example: Students in Speech 151 and Speech 251 take a multiple-choice test; the semester they graduate, Speech majors and minors make an appointment to take the same test. |
Employer’s or internship supervisor’s direct evaluations of students’ performances | Evaluation or rating of student performance in a work, internship, or service-learning experience by a qualified professional. |
Observation of student performing a task | A professor or an external observer rates each student’s classroom discussion participation using an observation checklist. |
Culminating project: capstone projects, senior theses, senior exhibits, senior dance performance | Students produce a piece of work or several pieces that showcase their cumulative experiences in a program. The work is evaluated by a pair of faculty members, a faculty team, or a team of faculty and community members. |
Student publications or conference presentations | Students present their research to an audience outside their program. Faculty and/or external reviewers evaluate student performance. |
Description or list of what students learned | Students are asked to describe or list what they have learned. The descriptions are evaluated by faculty in the program and compared to the intended student learning outcomes. Example: After completing a service-learning project, students are asked to describe the three most important things they learned through their participation in the project. Faculty members evaluate the descriptions in terms of how well the project contributed to the program outcomes. |
Types of Indirect Data-collection Methods
INDIRECT METHODS | Example |
---|---|
Student surveys | Students self-report via a questionnaire (online, telephone, or paper) about their ability, attitudes, and/or satisfaction. E.g., Students answer questions about their information literacy competence via an online questionnaire. |
End-of-course evaluations (e.g., CAFE) or mid-semester course evaluations | Students report their perceptions about the quality of a course, its instructor, and the classroom environment. |
Alumni surveys | Alumni report their perceptions via a questionnaire (online, telephone, or paper). E.g., alumni answer questions during a telephone survey about the importance of particular program learning outcomes and whether they are pertinent to their current career or personal life. |
Employer surveys | Potential employers complete a survey in which they indicate the job skills they perceive are important for college graduates. Note: if the survey asks employers to directly evaluate the skills, knowledge, and values of new employees who graduated from Mānoa, the survey can be considered a direct method of evaluating students. |
Interviews | Face-to-face, one-to-one discussions or question/answer sessions. E.g., A trained peer interviews seniors in a program to find out what courses and assignments they valued the most (and why). |
Focus group interviews | Face-to-face, one-to-many discussions or question/answer sessions. E.g., A graduate student leads a focus group of 4-5 undergraduate students enrolled in Foundations Symbolic Reasoning courses (e.g., Math 100), asking them to discuss their experiences in the course, including difficulties and successes. |
Percent of time or number of hours/minutes spent on various educational experiences in and out of class | Students’ self-reports, or observations made by trained observers, of time spent on various educational experiences in and out of class. |
Grades given by professors that are not based on explicit criteria directly related to a learning outcome | Grade point averages or grades of students in a program. E.g., 52% of the students in Foundations Written Communication courses received an “A,” “A+” or “A-” grade. |
Job placement data | The percent of students who found employment in a field related to the major/program within one year. |
Enrollment in higher degree programs | The number or percent of students who pursued a higher degree in the field. |
Maps or inventories of practice | A map or matrix of the required curriculum and instructional practices/signature assignments. |
Transcript analysis or course-taking patterns | The actual sequence of courses (instead of the program’s desired course sequence for students). |
Institutional/program research data | Information routinely compiled by the institution or the program (e.g., by an institutional research office). |
Part 2. Benefits and Drawbacks of Data-collection Methods
When selecting a data-collection method, consider:
- the method’s consequences (intended and unintended),
- whether the method will be seen as credible by the faculty and intended users of the results, and
- whether faculty and users will be willing to make program changes based on the evidence the method provides.
Some methods have beneficial consequences unrelated to the results of the evaluation. For example:
- Portfolios: Keeping a portfolio can lead students to become more reflective and increase their motivation to learn.
- Embedded assignments: When faculty members collaborate to create scoring rubrics and reach consensus on what is acceptable and exemplary student work, students receive more consistent grading and feedback from professors in the program.
DIRECT METHODS | BENEFITS | DRAWBACKS |
---|---|---|
Licensure or certification | | |
National exam or standardized test | | |
Local exam (external to courses) | | |
Embedded testing or quiz | | |
Embedded assignment | | |
Grades calibrated to explicit student learning outcome(s) | | |
Portfolio | | |
Pre-/post-test | | |
Employer’s or internship supervisor’s direct evaluations of students’ performances | | |
Observation of student performing a task | | |
Culminating project: capstone projects, senior theses, senior exhibits, senior dance performance | | |
Student publications or conference presentations | | |
Description or list of what students learned | | |
INDIRECT METHODS | BENEFITS | DRAWBACKS |
---|---|---|
Student surveys | | |
End-of-course evaluations (CAFE) or mid-semester course evaluations | | |
Alumni surveys | | |
Employer surveys | | |
Interviews | | |
Focus group interviews | | |
Percent of time or number of hours/minutes spent on various activities related to a student learning outcome | | |
Grades given by professors that are not based on explicit criteria directly related to a learning outcome | | |
Job placement data | | |
Enrollment in higher degree programs | | |
Transcript analysis or course-taking patterns | | |
Institutional research data | | |
Part 3. Evaluate Your Choice of Data-collection Method
An effective data-collection method:
- provides specific answers to the assessment question being investigated.
- is feasible to carry out given program resources and amount of time faculty members are willing to invest in assessment activities.
- maximizes positive effects and minimizes negative ones. The method should give faculty members, intended users, and students the right messages about what is important to learn and teach.
- provides useful, meaningful information that can be used as a basis for decision-making.
- provides results that faculty members and intended users will believe are credible.
- provides results that are actionable. Faculty members will be willing to discuss and make changes to the program (as needed) based on the results.
- takes advantage of existing products (e.g., exams or surveys the faculty/program already use) whenever possible.