Assessment Shorts: Examples of Program Assessment in Action

Foundations Written Communication Program

Improving the Foundations Written Communication Program One Outcome at a Time

     Erica Reynolds Clayton, an assistant professor of English who specializes in composition and rhetoric, views program assessment as “a complicated intellectual process . . . almost like a forensic science [that] can be honed with practice.” But as Chair of the English Department’s Assessment Committee, tasked with assessing Student Learning Outcomes for the General Education Foundations--Written Communication (FW) program, she acknowledged that she and her team had to tread lightly in order to share that sense of intellectual curiosity about assessment with other English Department faculty. To make headway, they began by focusing on just one of the four FW learning outcomes.

     Their program assessment study of outcome #1, which dealt with writing for a particular audience and a specific purpose, was designed to provide information to those who teach FW courses and to be published so other universities could benefit. Following standard research procedures, the Institutional Review Board approved the assessment study and students signed consent forms. To collect data on outcome #1, the assessment team asked the 40+ instructors teaching FW sections to assign a common in-class writing task and collect the resulting student work. The team was pleased when 100% of the instructors came on board. “They all collected what their students felt to be the best paper they had written in terms of specific audience and specific purpose,” Erica said, adding with pride that “the faculty who were involved were, I think, really illuminated.”

     The assessment team gathered graduate students and faculty from the departments of English and Second Language Studies to score the student writing in two evening sessions. The assessment team, with the help of the Assessment Office, had created two rubrics that the graduate students and faculty used to evaluate the student writing. To score the papers consistently, the scorers first trained and practiced. “We used actual student papers that we had accrued to test against the rubric,” Erica said, resulting in “a high rate of inter-rater reliability.” One of the unexpected benefits of the rubric came when some instructors began distributing it to students. “In English you are sometimes subject to grade grievances . . . because of the subjectivity inherent in providing feedback and grading students’ papers,” Erica said. “Rubrics are a way for the students and the teachers to be accountable to each other and have it right out there in front of them.”

     The program assessment study of outcome #1 was used, in part, to investigate possible effects of the different versions of FW courses. Therefore, the results were disaggregated by course type: standard English 100 class, the mentored version of English 100, the Honors English 100 class, English 101, and the English Language Institute 100 class. Results showed differences among the versions with students in the mentored version of English 100 scoring higher on average than those in non-mentored sections. “Students in mentored sections actually thought a lot more meta-cognitively . . . in terms of knowing who the audience was and what the purpose was,” Erica said.

     The program assessment results were disseminated at a colloquium attended by faculty and guests. The assessment team gathered suggestions for an improvement plan from the audience. “We talked about the results and then brainstormed the kinds of assignments we could employ to garner students’ focus in terms of purpose and audience,” Erica said. The process fostered intra- and inter-departmental communication and resulted in a new web page that has become a portal of information for the faculty who teach FW classes. “I do think that there is more communication and conferring going on,” Erica said. “This burgeoning sort of atmosphere where instructors are conferring with other instructors . . . wanting to seek out new ways to structure their curriculum.”

     After this promising start, outcome #4 was assessed the following year. However, results on outcome #4, which addresses students’ information literacy skills, fell short of expectations. “We were pretty disappointed,” Erica confided. “You could really see how the students were struggling.” But Erica remained optimistic that the results would spur program improvement. “I think that it will encourage teachers or instructors to design assignments that will really get at our SLOs,” she said. “If [students are] doing nothing but writing haikus all day or ‘what I did at grandma’s house last summer’ we aren’t preparing them” in the area of information literacy.

     The assessment of outcome #4 did encourage those teaching FW classes to make full use of the information literacy workshops offered by UHM Libraries. “They have a really great program,” Erica noted, adding “the majority of our professors did take their students to at least one seminar so that was a positive thing.” Looking forward, Erica is confident that re-assessing the information literacy SLO in three years’ time will show evidence of improvement. “Having these SLOs on the syllabi, everybody in the department knows . . . they’re going to have to be [assigning] some kind of a research paper that would allow for students to exhibit their knowledge with relation to information literacy . . . I think that’s positive in itself.”

     Program assessment has been an enlightening learning experience for Erica Reynolds Clayton. “I really think it’s just a matter of people getting exposure to assessment practices because I’ve seen people really turn their ideas around about assessment just being exposed to it,” she said. For some faculty, Erica believed it was just a matter of becoming fluent in a new vocabulary. “This is a tool that they’re already [using] but we’re just using language now to talk about it,” she said. The process of strengthening the FW program’s assessment practices has confirmed for Erica not only that the English Department’s student outcomes are strong, but also that the FW program is achieving what it set out to do through assessment: “improve [students’] chances for succeeding in their future classes.”


Foundations--Written Communication

Keys to Success

  • Begin simply so everyone is encouraged to get involved.

  • If you need help structuring a rubric, lean on the Assessment Office for guidance.

  • Keep assessment a routine part of the academic year.

Types of Evidence
  • Student-selected writing samples: each student submitted the piece of writing they believed best demonstrated their ability with regard to the learning outcome.

  • In-class reflective essays in which students discussed how their writing meets the learning outcome.
