Unit: English
Program: English (BA)
Degree: Bachelor's
Date: Sat Oct 10, 2009 - 6:31:37 pm

1) List your program's student learning outcomes (SLOs).

As part of a challenging program in literary and cultural studies, English language studies, composition and rhetoric, and creative writing, students develop advanced skills as readers, writers, and interpreters of texts across a variety of genres and rhetorical situations, and they come to recognize Hawai’i’s geographic and cultural location in the Pacific.

2) Where are your program's SLOs published?

Department Website URL: http://www.english.hawaii.edu
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number: 115
Course Syllabi. URL, if available online:
Other: English Department Mission Statement and Strategic Plan, p. 2 at http://www.english.hawaii.edu/users
Other:

3) Upload your program's current curriculum map(s) as a PDF.

No map submitted.

4) What percentage of courses have the course SLOs explicitly stated on the course syllabus, department website, or other publicly available document? (Check one)

0%
1-50%
51-80%
81-99%
100%

5) State the SLO(s) that was Assessed, Targeted, or Studied

English 100 Information Literacy (FW SLO #3): Compose a text that makes use of source material that is relevant and reliable and that is integrated in accordance with an appropriate style guide.

6) State the Assessment Question(s) and/or Goal(s) of Assessment Activity

How well are students achieving the outcome? To what extent do they supply an outside source when needed, select germane material, select source material from authoritative sources, and follow the rules for citing source material in-text and creating a reference list?

7) State the Type(s) of Evidence Gathered

Student papers written in spring English 100 courses that were intended to demonstrate accomplishment of the SLO being assessed.

8) State How the Evidence was Interpreted, Evaluated, or Analyzed

An English Department faculty/G.A. team assisted by the UHM Assessment Office applied a rubric drafted by the English Department Assessment Committee and then revised and finalized by our Assessment Coordinator and the UHM Assessment office.

Scoring team: Professors Erica Clayton, Mark Heberle, and Ruth Hsu and graduate candidates Holly Bruland, Philip Drake, and Annette Priesman
Prepping team: Monica Stitt-Bergh and Marlene Lowe of the UHM Assessment Office

9) State How Many Pieces of Evidence Were Collected

86 student papers. These were selected at random, 2 per section, from a collection of hundreds of papers submitted anonymously by spring English 100 instructors.

10) Summarize the Actual Results

Overall, only 6% of student writers were “well-prepared” enough to fully satisfy the criteria for information literacy; 48% were “prepared”; 26% were “partially prepared”; and 21% were “not prepared.” (Four separate skills were evaluated, each with its own results, with “relevancy” and “credibility” of sources being the strongest areas, “adherence to citation rules” the weakest, and “makes use of sources [where necessary]” between these extremes.)

11) Briefly Describe the Distribution and Discussion of Results

Results were distributed to the Department’s 2008-2009 and 2009-2010 Assessment Coordinators and to the 2009-2010 Departmental Assessment Committee.

They were discussed initially among UHM Assessment Office staff and the two departmental coordinators, who also discussed what is to be done next.

12) Describe Conclusions and Discoveries

The faculty team that scored the texts was disappointed in the quality of student work with regard to information literacy. Possible problems included the following: poor assignment guidelines; lack of practice for students when only one assignment requires source material, so a number of information literacy activities may need to precede or accompany assignments; very little use of sources other than to support a fact or summary; and source-based personal essays that seemed forced and irrelevant to the assignment. Other suggestions were to have students master the use of signal phrases to introduce sources; to use research questions to focus invention and composing of source-integrating papers; and to teach the difference between summarizing facts and presenting and challenging arguments and claims.

13) Use of Results/Program Modifications: State How the Program Used the Results --or-- Explain Planned Use of Results

At present, results have yet to be disseminated to faculty; a department-wide or at least English 100-wide discussion needs to be set up; and FW instructors this spring need to be given examples of good papers and suggestions for improving student performance on this SLO and on the related SLO #4 that will be assessed this year. In light of the relatively mediocre results, the Department Assessment Committee recognizes the need to produce rubrics and calls for papers as early as possible, at the end of the fall and the beginning of the spring semester.

14) Reflect on the Assessment Process

Not really: if there were sufficient time, personnel, and funding, we would read far more papers than we were able to last year.

The assessment session was well organized and revelatory for assessors; the rubric underwent several revisions and ended up being very effective in locating skills and helping to uncover their relatively disappointing implementation; what was learned from this assessment will be helpful and valuable for the next.

15) Other Important Information

The same procedure, possibly with more papers, can be followed for assessment of the next FW SLO.  The Department Assessment Committee sees the need to have all four General Education-approved SLOs not only included on all English 100 syllabi but also practiced more extensively by students in future semesters.

Beyond direct assessment of FW SLOs, the English Department instituted a policy of collecting and examining faculty syllabi every semester to determine whether or not faculty are meeting minimum expectations and including SLOs. Collection and analysis of 300- and 400-level syllabi took place in 2008-2009, along with dissemination of a college-wide survey to all B.A., M.A., and Ph.D. graduates in spring 2009; the results will be used to identify and address apparent weaknesses in the three programs as well as to see what we are doing well.

16) FOR DISTANCE PROGRAMS ONLY: Explain how your program/department has adapted its assessment of student learning in the on-campus program to assess student learning in the distance education program.

At present, we do not have a distance education program, only occasional distance education courses taught by a very limited number of faculty.

17) FOR DISTANCE PROGRAMS ONLY: Summarize the actual student learning assessment results that compare the achievement of students in the on-campus program to students in the distance education program.