Unit: History
Program: History (BA)
Degree: Bachelor's
Date: Thu Sep 30, 2010 - 12:06:38 pm

1) Below are the program student learning outcomes submitted last year. Please add/delete/modify as needed.

Undergraduate Student Learning Outcomes (SLOs):

1) Students can explain historical change and continuity.

2) Students can develop a clear argument using recognized historical methods.

3) Students can write clear expository prose and present their ideas orally according to disciplinary conventions.

4) Students can interpret and use primary sources.

5) Students can identify the main historiographical issues in a specific area of concentration.

2) As of last year, your program's SLOs were published as follows. Please update as needed.

Department Website URL: http://manoa.hawaii.edu/history/undergraduate
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: Some individual faculty share departmental SLOs on their syllabi.
Other: Course listings on departmental website: http://manoa.hawaii.edu/history/courses
Other: Our curriculum map is also published: http://manoa.hawaii.edu/history/undergraduate

3) Below is the link to your program's curriculum map (if submitted in 2009). If it has changed or if we do not have your program's curriculum map, please upload it as a PDF.

Curriculum Map File(s) from 2009:

4) The percentage of courses in 2009 that had course SLOs explicitly stated on the syllabus, a website, or other publicly available document is indicated below. Please update as needed.


5) State the assessment question(s) and/or goals of the assessment activity. Include the SLOs that were targeted, if applicable.

During this academic year, we set out to assess SLO #4: "Students can interpret and use primary sources."  This is part of our five-year assessment plan, included below.  Included in parentheses are the courses from which we will take representative samples of work.

2009-10: SLO #4 (496 in fall, 400-level in spring)

2010-11: SLO #5 (496 in fall, 396 in spring)

2011-12: SLO #3 (496 in fall, 400-level in spring)

2012-13: SLO #3, continued (300-level in fall); SLO #1 (496 in spring)

2013-14: SLO #2 (496 in fall, 300-level in spring)

6) State the type(s) of evidence gathered.

In Spring 2009, we collected papers from HIST 496, the students' capstone senior thesis course, in which students produce a lengthy piece of original work based on primary sources.  We analyzed these papers in Fall 2009, and that semester we also collected a representative sample of papers from 400-level history classes to analyze in Spring 2010.

7) Who interpreted or analyzed the evidence that was collected?

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)

8) How did they evaluate, analyze, or interpret the evidence?

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)

9) State how many persons submitted evidence that was evaluated.
If applicable, please include the sampling technique used.

In Spring 2009, we collected 12 papers total from the four instructors of HIST 496 (three each).  We asked for papers that were broadly representative of exemplary, medium, and poor work.  In Fall 2010, we revised our sampling technique: we asked instructors who assigned research papers to submit a sample representative of student accomplishment specifically on our targeted SLO (the interpretation and use of primary sources).  That is, if most of a class produced 'medium' papers with regard to SLO #4, we asked to receive mostly 'medium' papers rather than samples spanning the full range of achievement.  Using this method, we collected 15 papers from instructors of 400-level courses.

10) Summarize the actual results.


Our goal was that 75% of the HIST 496 final papers would score either “Competent” (Level 3) or “Accomplished” (Level 4) on our rubric (see below). In general our expectations were met, but there is clearly much more work to be done within the department on integrating primary sources into our teaching and into students' written work. None of the papers, for example, met our criteria for an “Accomplished” research paper, although 7 were deemed “Competent.” Of the remaining 5 papers, 3 were “Developing” and 2 were “Beginning.” In sum, 7 of the 12 papers (58%) were satisfactory, falling short of our 75% goal.


Committee members drew several conclusions from these results:

  • That faculty needed to place greater emphasis on the use and importance of primary materials in their instructions to students in 496
  • That the use of primary sources in 496 depended a great deal on issues of topic selection and focus, about which students need much more guidance
  • That our own rubric for evaluating the interpretation and use of primary sources needed some refining

Consequently, we have revised the scoring rubric for SLO #4 (see below), and we will refine it further as we continue our review of the History undergraduate program and work to give faculty feedback on improving student performance.


Scoring rubric for SLO #4: Interpretation and Use of Primary Sources

4 – Accomplished
  • Crafts an argument based on primary sources
  • Recognizes the historiographical significance of the argument

3 – Competent
  • Interprets primary sources
  • Develops an argument not based on primary sources

2 – Developing
  • Includes some primary sources, but fails to interpret them
  • Fails to develop an argument

1 – Beginning
  • Uses incorrect or few primary sources


In the spring, we applied our revised rubric to the papers collected from 400-level courses.  Because 400-level courses occupy a different place in our curriculum map than HIST 496, we set a different goal: two-thirds of papers should reach at least the "Developing" level.  In the end, 11 of 15 papers met this criterion, which we judged satisfactory.  The full results: one paper at Level 4, 3 at Level 3, 7 at Level 2, and 4 at Level 1.  We also determined that one course in the sample assigned a research paper but did not require the use of primary sources; with those five papers removed, overall performance was 9/10 at "Developing" or above (totals: one paper at 4; 3 at 3; 6 at 2; and 1 at 1).

These results aside, what was more striking to the committee was that only five of the 16 400-level courses offered in the department in Fall 2010 required a research paper.

11) How did your program use the results? --or-- Explain planned use of results.
Please be specific.

The results of last year's assessment activities were reported to the department at large at our first departmental meeting this fall.  We shared the specific recommendations listed above from the Fall 2010 assessment activities and, based on our spring findings, recommended that faculty assign more research papers using primary material in their 400-level classes.  The rationale we offered the faculty is that this is the only way to ensure that students are prepared to write their senior theses in HIST 496.

12) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, program aspects and so on.

Beyond those discussed in previous answers, the assessment process continued to yield conclusions and discoveries that inform our individual pedagogical approaches.

13) Other important information:

We expanded the Assessment Committee in the History Department to six members (which included changing the Departmental Manual), collected material to assess this fall, and began drafting new scoring rubrics for upcoming assessment activities.  We also continued to discuss the best ways to disseminate our findings to the department as a whole.