Unit: Academy for Creative Media
Program: Creative Media (BA)
Degree: Bachelor's
Date: Wed Oct 08, 2014 - 2:47:12 pm

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

1. Critical Thinking: Constructively critique their own and others’ intellectual and creative work.

(2a. Think critically and creatively)

2. Writing: Write a creative work that tells a story.

(2a. Think critically and creatively, 2c. Communicate and report)

3. Writing: Write a critical piece that applies theoretical principles.

(2b. Conduct research, 2c. Communicate and report)

4. History and Aesthetics: Know the intellectual history of cinema and place their work within that history.

(1b. Specialized study in an academic field)

5. Professional Skills & Creativity: Create a visual narrative through application of appropriate principles and production skills [production & animation]

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report)

6. Professional Skills & Creativity: Conduct and communicate original research findings [critical studies]

(2b. Conduct research)

7. Professional Skills & Creativity: Understand the essential collaborative nature of creative productions by working as a team member.

(3a. Continuous learning and personal growth)

8. Ethics and Responsibility: Understand and articulate the role and rights of a responsible artist.

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: www.hawaii.edu/acm
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: NA
Other:
Other:

3) Select one option:

Curriculum Map File(s) from 2014:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program assessment activities between June 1, 2013 and September 30, 2014? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)

Yes
No (skip to question 14)

6) For the period between June 1, 2013 and September 30, 2014: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.

Our goal was to develop and test three rubrics, one for each of the three tracks of study in our department.

7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.

The evidence was the draft rubrics and faculty evaluation forms from completed rubric testing.

8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Eight faculty members provided completed rubric scores and Rubric Testing Evaluation forms.

9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.

RUBRIC DEVELOPMENT

 

During the Fall 2013 semester, faculty in each of the three tracks of the ACM major met and developed rubrics for courses within their tracks.

 

The Digital Cinema Production faculty developed a rubric that was to be tested for ACM 399, which is an independent group study course that serves as a capstone course for many of the Digital Cinema students.

 

The Animation Track faculty met and developed a rubric for use in ACM 420, the capstone course for the Animation Track. However, due to the schedule for rubric testing, the rubric was tested on final projects from ACM 320, an animation production course offered each spring that leads into ACM 420 in the fall.

 

The Critical Studies Track faculty met and developed a rubric for use in ACM 460, which is one of the upper level Critical Studies courses in ACM.

 

In the Spring 2014 semester, student work was identified for use in the testing. In the late spring and summer of 2014, the faculty participants worked independently to score student works according to the rubric of their track and to complete a Rubric Evaluation questionnaire at the end of the process.

 

Of the ten instructional faculty who taught ACM courses during the 2013-2014 academic year, eight responded with feedback from the rubric testing assessment activity. It should be noted that two faculty members left on sabbatical during the middle of this reporting period.

 

RUBRIC TESTING RESULTS

 

Digital Cinema Track

Five of the Digital Cinema faculty members responded with evaluations from the testing; the data are compiled below. Three of the five faculty members tested the rubric on ACM 399 projects, and two tested it on ACM 310 courses. The numerical results of the ACM 399 testing are compiled in Figure 1. A single faculty reviewer reviewed each ACM 399 project; a total of 8 projects were reviewed. The Digital Cinema rubric used a numerical scale of 1-20, with 20 being the highest rating, across 5 categories: Form/Style, Content, Technical Quality, Process, and Overall Effectiveness. The total possible score for each film was 100. This link leads to the rubric that was tested.

 

The rubric evaluation survey summaries from all five respondents are presented in Table 1.

 

Figure 1

 

The learning outcome totals and overall point totals for each ACM 399 project are shown here.

 

Table 1. Summarized results from rubric evaluation form

ACM 399 Rubric Strengths

  • Descriptions of Form/Style, Content and Technical Quality are clear
  • Assessment categories very complete
  • “Very useful gauge for grading and for getting across to students what’s expected to succeed with their creative projects”
  • Makes the evaluation process easy
  • Helpful to divide points between the categories

ACM 399 Rubric Weaknesses

  • Does not address the role of crew members on group projects
  • Not sure how the points in the rubric translate into grades
  • Rubric may work for director or producer but not other roles in film
  • Numerical grade range too wide, “According to the rubric a student can be doing exemplary work and manage 75 possible points”; too harsh
  • Form/Style category needed to differentiate between screenplay and cinematic execution (too vague and over encompassing)
  • Evaluative criteria too lengthy and difficult to navigate
  • Too much flexibility within each quantitative level, need to create a sub-rubric to differentiate between 15-20 for example
  • Point spread in each level is too wide
  • In the category “Process” timeliness is given too much importance as was group process

 

Recommended Changes to ACM 399 Rubric

  • Change to mathematics of rubric “If something is incomplete what does it mean to give 0-4 points?”
  • Exemplary work should be 17 and above; proficient work below 17
  • Break Form/Style category down more into Cinematography, Editing, Sound, Directing etc.
  • Design/interface of rubric should be self-explanatory, easy to understand and use
  • Each level equal to one point, “for instance, Incomplete=1, Partially=2, Proficient=3, Exemplary=4”
  • Listed suggestions for alternate wording/phrasing throughout rubric

 

 

Animation Track

The two Animation Track faculty members responded to the rubric testing with numerical data and completed evaluations. The Animation Track chose 8 films randomly from the 14 projects of the Spring 2014 ACM 320 class; each faculty member rated the same set of eight films. The rubric consisted of 4 categories: Filmic Writing and Storytelling, Professional Skills in Animation, Professional Skills in Design, and Ethics. The rating scale was 1-5, with 1 representing the highest achievement. The data are presented in the graph in Figure 2, and the questionnaire summaries are presented in Table 2. A link is provided to the ACM 420 rubric.

 

Figure 2

 

Fig. 2 shows the total score of each student across all 4 categories (the lowest score indicates the highest achievement).

 

Table 2

ACM 420 Rubric Strengths

  • Helpful to separate out the elements of story, design, animation and ethics
  • Animation category had detailed levels
  • Design and concept categories were thorough

ACM 420 Rubric Weaknesses

  • Filmmaking concepts missing from rubric
  • Audio needs to be addressed in rubric
  • Technical aspects of final output need to be addressed

Recommended Changes to ACM 420 Rubric

  • Make changes to compensate for weaknesses that have been noted

 

Critical Studies

Two faculty members tested the Critical Studies rubric. One tested it on a sample of fifteen students selected to represent a broad range of academic performance in ACM 460, a senior-level ethics course. The second tested it on a group of students representing projects produced in ACM 350 and ACM 360. The rubric testing evaluation feedback can be found in Table 3. A link to the ACM 460 rubric is provided.

 

Table 3

ACM 460 Rubric Strengths

  • Gives a good idea about the quality and learning outcomes of student work

ACM 460 Rubric Weaknesses

  • No strengths in this approach to evaluation
  • No “one size fits all” that can be used across the board in this way
  • Pushes toward micromanaging of courses
  • Current assessment method too different from this model to make it useful
  • Too inflexible of a system to allow for each student as an individual

Recommended Changes to ACM 460 Rubric

  • The rubric system is at odds with treating students as unique individuals and providing tailored learning experiences

 

12) State how the program used the results or plans to use the results. Please be specific.

The ACM program plans to use the results from this process in the following ways. First, discussions have begun among the faculty as a whole as to the effectiveness of the rubrics and the changes that may need to be made. The cross-track rubric testing helped identify common areas that all tracks share and provided a basis for examining track competencies against departmental and institutional learning outcomes. The rubric evaluation process has also led to discussions regarding the Departmental Curriculum Map. All of these discussions are informing faculty members currently engaged in ACM program review activities.

Based on the results of the rubric testing, immediate steps are being taken to revise the rubrics and to include the revised rubrics in the relevant course syllabi in the coming semesters. Current discussions center on how this may aid student understanding of course and instructor expectations at the beginning of a course.

13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.

Through many discussions that took place in faculty meetings as well as email exchanges, it is clear that how rubrics are to be implemented is still under consideration. Not all participants are comfortable with creative work being assessed through a numerically quantifiable process, and discussion continues over whether quantifying a series of qualitative learning objectives is advisable. However, it is apparent from these discussions that some aspects of creative learning outcomes need to be quantified in order to make informed programmatic decisions. Part of the challenge ACM faces is the diversity of its three tracks of focus and the nature of the creative products associated with filmmaking. Through this process we have reached a valuable juncture at which discussions about learning outcomes and the overall curriculum map may now begin.

14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.

Our two faculty members on leave and the impending cut to the departmental lecturer budget for Spring 2015 are affecting our current assessment activities. We plan to proceed with revision of the three rubrics this fall, to continue developing additional course rubrics in Spring 2015, and to promote the use of the three revised rubrics in course syllabi in Spring and Fall 2015.