Develop a Capstone

The first half of this document defines and discusses the capstone experience. The latter half covers using capstones for program assessment.

The Capstone Experience

Definition

The capstone experience is a culminating set of experiences that “captivate, encapsulate, synthesize, and demonstrate learning.” [1]

Keys to the Capstone

  1. The capstone should be a culminating set of personal, academic, and professional experiences.
    • In a capstone course, students synthesize, integrate, and/or apply their previous knowledge rather than acquire new knowledge or skills; the emphasis is on demonstrating mastery, not on learning new material.
    • A capstone should occur near the end of the program. [Tip: schedule the capstone course before the student’s last semester in case remediation is needed.]
    • Student ownership, responsibility, and engagement should be central to the capstone.
  2. Rationale for the framework (see below) should be based on the specific needs of the program/discipline.
  3. The products (e.g., written assignments) of the capstone should be designed to help assess the program’s desired student learning outcomes.
  4. Discussion, reflection, and/or demonstration of general education and/or institutional outcomes should be evident in the capstone. [Note: some general education outcomes may not be relevant, but a capstone experience can likely address outcomes such as effective written and oral communication, ethical decision making, accessing and processing information, problem solving, and methods of inquiry and analysis.]
  5. Satisfactory completion of the capstone experience should be required for graduation.
  6. Full-time (tenured) faculty members should facilitate, mentor, and/or coordinate the capstone experience.

Frameworks for a Capstone Experience

There are four common frameworks for capstones (see Rowles et al. [1]). Programs typically choose one as the primary framework based on their program’s needs. If/when appropriate, the other frameworks may also be incorporated or acknowledged.

  1. Mountaintop. Students from two or more disciplines (or specializations) engage in interdisciplinary inquiry. For example: Geography majors and Biology majors enroll in their major’s capstone courses and are paired with a student from the other discipline. Each GEOG-BIOL pair of students completes an interdisciplinary project, such as one that uses geographic information systems (GIS) to monitor fish migration patterns or habitat changes.
  2. Magnet. Students pull together their learning from multiple courses and/or experiences. For example, students gather their best work samples from four courses (can also include internship, practicum, service learning, etc.), choosing samples that directly address the program’s learning outcomes.
  3. Mandate. Students document their learning in relation to external industry/professional standards or requirements. For example, civil engineering students gather evidence to demonstrate they have achieved the outcomes set forth by the American Society of Civil Engineers.
  4. Mirror. Students reflect on their experiences and metacognitive skills in relation to program goals and outcomes. For example, students write short reflective pieces that describe what they have learned and how their assignments and experiences have helped them achieve each program outcome.

Options for Courses/Activities within the Capstone Experience

A capstone experience can consist of one or a combination of these:

  • A course in the major
  • An interdisciplinary course with a minimum of two distinctly different disciplines represented
  • An out-of-class/co-curricular experience
  • A service- and/or community-based learning experience
  • An application/demonstration of knowledge (e.g., thesis, design project, portfolio development)
  • A college-to-work/career transition experience (e.g., internship, informational interviewing)

Pedagogic Practices for Capstone Experiences

Professors typically use some of the following teaching strategies and methods in capstone experiences:

  • Collaborative learning

“Collaborative learning is an umbrella term for a variety of educational approaches involving joint intellectual effort by students, or students and teachers together. Usually, students are working in groups of two or more, mutually searching for understanding, solutions, or meanings, or creating a product. . . . Most center on students’ exploration or application of course material, not simply the teacher’s presentation or explication of it. Collaborative learning represents a significant shift away from the typical teacher-centered or lecture-centered milieu in college classrooms.” Collaborative Learning: A Sourcebook for Higher Education (1992) by Anne S. Goodsell et al., National Center on Postsecondary Teaching. Available through interlibrary loan from UH Hilo, LB1032.C65.1992.

  • Self-directed learning

Faculty members give students choices about their learning as well as responsibility for the consequences associated with those choices. The faculty member (or internship supervisor, co-op employer, etc.) establishes the necessary structures to guide and support students while still leaving the students to do such things as establish goals, create timelines, monitor progress, develop products for evaluation, etc.

  • Problem-based learning

Faculty members give students an ill-defined task to complete or an open-ended problem to solve. The faculty member acts as a mentor, coach, and/or facilitator. Often the task or problem mirrors an actual, discipline-based one, but it has been simplified or structured to match the level of the students.

  • Learner-centered (Learner-centered = a focus on what the students are learning and doing, not on what the professor is delivering or doing)

Faculty members design assignments that promote critical thinking, integration, reflection, and synthesis. They give students assignments and activities that encourage students to “suspend judgment, maintain a healthy skepticism, and exercise an open mind”; professors design activities that call for the “active, persistent, and careful consideration of any belief in light of the ground that supports it.” [Taken from John Dewey’s How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process (1933). Available at Hamilton Library BF455.D5.1933.]

Discussion Questions for Faculty Members as They Consider a Program Capstone Experience

  1. What framework best meets the needs of the program and its goals?
    • An interdisciplinary, synthesizing experience?
    • A discipline-specific, synthesizing experience?
    • A method to satisfy external industry/professional standards or requirements?
    • A reflective, synthesizing experience?
  2. Is it necessary to satisfy discipline/profession accreditation requirements?
  3. In what ways will the capstone experience be beneficial to the students’ post-baccalaureate experience?
  4. In what ways will the capstone experience support the (relevant) general education requirements?
  5. What components of the capstone experience will address students’ personal growth? Academic growth? Professional growth?
  6. How will students be guided toward and prepared for the capstone experience? What program structures will be in place (e.g., course requirements, pre-requisites, advising)?


Using a Capstone Experience for Program Assessment

When using a capstone experience for program assessment, the standard assessment loop is followed: establish outcomes, create learning opportunities, undertake an assessment process, interpret assessment results, and create and implement an action plan for improvement.

Establish Student Learning Outcomes & Determine Learning Opportunities

  • The program creates desired Student Learning Outcomes (SLOs) [How to develop outcomes].
  • The program deliberately incorporates learning opportunities—activities and assignments—into the curriculum and capstone experience so that students can achieve the desired SLOs. Typically, the program can assess all or nearly all program SLOs using the capstone experience. A well-designed curriculum plotted on a curriculum map illustrates how and where SLOs are introduced, reinforced, and then mastered and demonstrated in the capstone experience. [How to create a curriculum map].
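
Below is a minimal, hypothetical sketch of one way to keep such a curriculum map in a simple, machine-readable form, marking where each SLO is Introduced (I), Reinforced (R), or Mastered and demonstrated (M); the course numbers and SLO labels are invented for illustration and do not describe any specific program.

    # Hypothetical curriculum map: for each SLO, the courses where it is
    # Introduced (I), Reinforced (R), or Mastered/demonstrated (M).
    # Course numbers and SLO names are invented for illustration.
    curriculum_map = {
        "SLO 1: effective written communication": {"101": "I", "250": "R", "490 capstone": "M"},
        "SLO 2: disciplinary research methods": {"210": "I", "310": "R", "490 capstone": "M"},
    }

    for slo, courses in curriculum_map.items():
        print(slo)
        for course, level in courses.items():
            print(f"    {course}: {level}")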

Assessment Process: Collecting and Evaluating/Analyzing Evidence

The assessment process should lead to a discussion of the program as a whole—not only a discussion of the capstone experience.

The program typically builds evidence-generating, -collecting, and -evaluating mechanisms into the capstone experience.

Collection of Evidence: Student Learning

When designing the capstone experience, programs build in assignments/activities that can shed light on the SLOs and relevant general education outcomes. The assignments/activities serve as evidence of student achievement. Students can complete them over time [How to develop portfolios] or in a single, culminating course.

Examples of assignments/activities:

  • Written documents: research report, thesis, proposal, case study, project report, reflective essay, review of the literature, resume, progress reports, informal writing (notes, lab notebook, observation log, informal analyses, academic journal, etc.)
  • Oral presentation(s)
  • Poster presentation(s)
  • Documentation of group work: peer review/feedback, group progress reports, evaluation of group members/group effectiveness
  • Internship supervisor’s evaluation/feedback on student performance
  • Interview (e.g., mock job interview, oral defense)
  • Meeting facilitation (e.g., students facilitate a community meeting)
  • Exam(s) (locally-developed, state, or national)

Evaluation of Evidence of Student Learning

Most capstone experiences include a senior-level course. The course instructor can assist the students in preparing evidence for evaluation.

  • Good practices:
    • Take a (random) sample of students and evaluate their work for the purposes of program assessment.
    • Use a rubric to evaluate qualitative materials such as written reports, short-answer exam questions, oral presentations, etc. [How to create a rubric]
      • Provide the rubric to the students.
      • Have professors use the rubric in other courses that introduce or reinforce the SLO.
      • Have each professor apply the rubric in the same way. Use examples of student performance at varying levels of mastery to calibrate professors/reviewers.
    • Have at least two faculty members evaluate the evidence using criteria agreed upon by the faculty (e.g., use an agreed-upon rubric); a minimal agreement-check sketch follows this list.
    • Have external faculty members and/or business community members evaluate the student work.
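
As a minimal sketch of how reviewers’ calibration might be checked, the snippet below (Python, standard library only) computes simple percent agreement and Cohen’s kappa for two raters who scored the same set of student papers on a four-level rubric; the scores and rubric levels are hypothetical, not drawn from any actual program.

    # Minimal agreement check for two raters applying the same 4-level rubric
    # (1 = beginning ... 4 = mastery) to the same set of student papers.
    # All scores below are hypothetical.
    from collections import Counter

    rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]
    rater_b = [4, 3, 2, 2, 4, 1, 3, 3, 4, 3]
    n = len(rater_a)

    # Simple percent agreement.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Cohen's kappa corrects for the agreement expected by chance.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in set(counts_a) | set(counts_b)) / n ** 2
    kappa = (observed - expected) / (1 - expected)

    print(f"percent agreement = {observed:.2f}, Cohen's kappa = {kappa:.2f}")

Low agreement suggests the reviewers should recalibrate using the examples of student performance at varying levels of mastery mentioned above.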

Collection of Evidence: Student Perceptions

  • Programs can also collect evidence of students’ perceptions in capstone experiences. Examples of data-collection methods:
    • End-of-course or end-of-program surveys
    • Exit interviews
    • Focus groups
    • Blogs

Evaluation of Evidence of Student Perception

  • Quantitative data (e.g., Likert-scale ratings) can be summarized using descriptive statistics (a minimal sketch follows this list).
  • Open-ended survey responses, interview data, focus group data, blogs, etc., can be analyzed using qualitative methods to identify themes and areas of consensus.
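
As a minimal sketch of the quantitative side, assuming items are coded on a 1–5 Likert scale, the snippet below (Python, standard library only) computes basic descriptive statistics; the item wording and responses are invented for illustration.

    # Descriptive statistics for hypothetical Likert-scale survey items (coded 1-5).
    from collections import Counter
    from statistics import mean, median, stdev

    survey = {
        "The capstone helped me integrate learning from earlier courses": [5, 4, 4, 3, 5, 4, 2, 5],
        "I understood the evaluation criteria before starting my project": [4, 3, 5, 4, 4, 2, 3, 4],
    }

    for item, responses in survey.items():
        dist = Counter(responses)  # frequency of each rating
        print(item)
        print(f"  n={len(responses)}  mean={mean(responses):.2f}  "
              f"median={median(responses)}  sd={stdev(responses):.2f}")
        print("  distribution:", {rating: dist[rating] for rating in sorted(dist)})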

Assessment Results

The goal of assessment is to provide the program with information it can use to be self-reflective and self-improving. Presenting the results does not need to be complicated. A simple, straightforward presentation of the who, what, where, when, and how will often suffice. [How to report results] Discussion of the results should focus on the SLO(s) and the program, not on individuals.

Action Plan for Improvement

Programs use assessment results to guide program decision making and improve their effectiveness. By periodically discussing assessment results and procedures, faculty can plan and implement improvements to the capstone experience. The focus should be on the program and the learning opportunities (e.g., activities throughout the program, all required courses, etc.), not on individuals.

Assessment: Rubric & Checklist

The Western Association of Schools and Colleges (WASC) created a rubric to assess programs that use the capstone experience as the basis for program assessment. Programs can use the WASC rubric to self-assess their progress. Below is a checklist based on the WASC rubric.

Checklist. The program has:

1. Identified the relevant program SLOs that will be assessed using the capstone experience.
2. Identified lines of evidence and routinely collects that evidence.
3. Developed explicit evaluation criteria (e.g., rubrics).
4. Identified examples of student performance at varying levels of mastery for each outcome.
5. Pilot tested and refined evaluation criteria (e.g., rubrics). Used feedback from external reviewers to improve the assessment process; used external benchmarking data.
6. Informed students of the evaluation criteria.
7. Calibrated those who apply the evaluation criteria and routinely checks inter-rater reliability.
8. Informed students of the purpose and outcomes of the capstone so that they embrace the capstone experience.
9. Made information about the capstone readily available.

Sources

[1] “Toward a Model for Capstone Experiences: Mountaintops, Magnets, and Mandates” by C.J. Rowles, D.C. Koch, S.P. Hundley, & S.J. Hamilton. Assessment Update, 16(1), Jan/Feb 2004. [Available online via Hamilton Library.]
[2] “Capstone Experiences and Their Uses in Learning and Assessment,” workshop by S.P. Hundley, Assessment Institute (sponsored by IUPUI), October 2008.