Unit: Communications
Program: Communication (BA)
Degree: Bachelor's
Date: Mon Oct 08, 2012 - 12:38:23 pm

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

Abstracted from the Catalog & the Website: The undergraduate program reflects the department's commitment to the mission of the College of Social Sciences: to offer courses that provide students with a sound understanding of fundamental communication processes in contexts ranging from formal organizations to the community and society at large. The program also provides students the opportunity to select courses that allow them to specialize in a variety of interest areas within the field, including interpersonal communication, intercultural communication, international communication, organizational communication, information and communication technologies, telecommunication, and multimedia production. Specialization pathways can be self-selected or chosen in consultation with a faculty advisor.

Among the learning outcomes we anticipate are that students can:

  1. Design communication and media projects to make meaningful contributions to diverse social, professional or academic communities.
  2. Reflect critically on communication products such as media productions, research and policy reports and everyday texts.
  3. Demonstrate preparedness for academic and professional careers in communication.
  4. Communicate effectively to a variety of audiences, orally, in writing, and through digital media.
  5. Demonstrate global awareness, including an awareness of cultures in the Hawaii-Pacific region and issues related to cross-cultural communication.
  6. Engage in collaborative problem solving, both face-to-face and in online environments.
  7. Analyze the ethical dimensions of communication.
  8. Critically evaluate the use of technology in communication.

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.communications.hawaii.edu/com/index.html
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online: NA
UHM Catalog. Page Number: 108
Course Syllabi. URL, if available online: http://socialsciences.people.hawaii.edu/esyllabi/index.cfm
Other: Most faculty post syllabi on their own UHM websites
Other: Department website: http://www.communications.hawaii.edu/com/index.html

3) Select one option:

Curriculum Map File(s) from 2011:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.


5) Did your program engage in any program assessment activities between June 1, 2011 and September 30, 2012? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)

Yes

6) For the period June 1, 2011 to September 30, 2012: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.

In September 2012, we completed our first annual assessment report for the new curriculum.

The summary report (created by Prof. Tom Kelleher) follows:

University of Hawaii School of Communications

2012 Assessment Panel Report

ICTs & Policy Capstone Track


Four new capstone project courses in the Communication B.A. program, taught for the first time in spring 2012, challenged students to demonstrate mastery of the curriculum by creating an original project. The capstones also included creation of an electronic portfolio as evidence for assessment. While the specific projects vary, each portfolio includes written reflection and electronic artifacts created in prerequisite courses or through capstone assignments.

This fall, as outlined in our prior five-year strategic plans, we convened our first assessment panel to assess the ICTs and policy track. Our plan calls for one track to be assessed each year in a four-year rotation.

The panels are to include alumni, professionals in the field, and faculty representing other capstone tracks.

After reviewing a census of 15 electronic portfolios from COM 479, Capstone in ICTs and Policy (offered spring 2012), our first panel met for three hours on September 14, 2012 to discuss the outcomes and offer feedback.

The 2012 Panel

Christina Higa is associate director of the Telecommunications and Social Informatics Research Program (TASI) and co-principal investigator of the Pan Pacific Education and Communication Experiments by Satellite (PEACESAT). She is an alumna of our program (B.A. 1991, M.A. 2002).

Cassandra Harris is social media specialist for the Hawaii Senate. She is also an alumna of our program (B.A. 2007, M.A. 2010).

Patricia Buskirk is assistant professor in the School of Communications. She anchors the multimedia track in the COM B.A.

Tom Kelleher is professor and chair of the School of Communications. He anchors the communication-in-communities track in the COM B.A.

Jenifer Winter is assistant professor in the School of Communications. She anchors the ICTs & policy track in the COM B.A. She also designed and taught the capstone under review in this round. In that capacity, she met briefly with the panel to discuss the track and the course, but was not involved in producing the review below.

*Professor Emeritus Dan Wedemeyer, who was instrumental in the development of the ICTs & policy track, and Associate Professor Marc Moody, who anchors the digital cinema capstone track, were unavailable to participate this semester.

Feedback on SLOs

SLO1 Design communication and media projects to make meaningful contributions to diverse social, professional or academic communities.

We identified a range of performance on the first SLO, from “not quite developed yet” to “ready to go” with professional-quality work. This applied to contributions we considered primarily professional (including video projects and policy documents) as well as academic (research papers). In both contexts, we were satisfied that the large majority of students were “ready to go.” Even where the assignments students completed were not of publication or professional quality, they showed evidence that most students are ready to work on this type of assignment in a professional setting. For example, we wouldn’t expect the proposed bill for a Fair Internet Act to be ready for introduction to the Senate, but we do see it as strong evidence that the student is well prepared to work on this type of project in a professional context.

SLO2  Reflect critically on communication products such as media productions, research and policy reports and everyday texts.

In general, students did well selecting issues or problems to be discussed and describing them. The topics chosen by students were stated, described, and clarified in a manner that evidenced good understanding. In terms of analysis and synthesis, many of the students integrated a wealth of literature to inform their papers. The paper on SNS usage during the presidential election and the paper on CMC and romantic relationships were cited as positive examples.

One rubric area that was a bit lacking was the questioning of experts. In many cases, students presented a particular position or thread of research well, but did not elaborate on alternative or conflicting viewpoints.

SLO3  Demonstrate preparedness for academic and professional careers in communication.

The range of student endeavors within the ICTs & policy track impressed us, as did the evidence of how students have developed self-awareness of these interests.

For professional careers, we felt that almost all of the students are hirable in related fields, but that some may be limited in their contributions by their writing skills. These students are mature enough and knowledgeable enough, but may be dependent on others for editing and assistance with written reports and assignments. In some jobs this will be more of a challenge than in others.

For academic careers, the next step would likely be a master’s program. We felt that somewhere between one third and one half of this cohort is prepared for that option.

One factor that we believe is essential to consider in assessing career preparedness is life experience. Clearly some of these students already have launched careers while others may have entered the program a year or two after high school. This affects the range of preparedness in the group. While some students likely moved from “marginal” to “proficient,” others may have entered the program already “proficient” or even “exemplary” in career preparedness. We didn’t try to specifically assess each student’s longitudinal progress through the program, but it appeared that students from across the broad range of preparedness entering the program were all able to benefit (i.e., good proximal development).

SLO4  Communicate effectively to a variety of audiences, orally, in writing, and through digital media.

There was some uncertainty within the panel on how this rubric item was to be interpreted (see process suggestions below). When we looked at the learning outcome as a matter of how well students communicated in ‘all of the above’ (i.e., orally, in writing, and through digital media), we were impressed with the variety across the whole sample of portfolios, which ranged from written reports to PowerPoint presentations to video productions to websites to transmedia campaigns. However, at the individual portfolio level, the degree to which each student appeared to achieve SLO4 depended heavily on the items each chose to include in his or her portfolio as evidence.

When SLO4 was interpreted with more of an either-or logic (i.e., orally, in writing, or through digital media), we focused on the text-based ICT- and policy-specific written projects. Looking at these mostly text-based reports, we felt that the projects included as evidence indicated adequate to thorough understanding of audience, context, and purpose.

Writing issues were a stumbling block for some. The evidence in the sample covered the whole range of the rubric: from students using language that “sometimes impedes meaning because of errors” on one end to students using “graceful language that skillfully communicates meaning” on the other. 

SLO5  Demonstrate global awareness, including an awareness of cultures in the Hawaii-Pacific region and issues related to cross-cultural communication.

A large majority of students included at least one good piece of evidence in this category. This evidence came from assignments that specifically asked students to consider global and/or cultural perspectives. These papers focused on international locations, mobile ethnography projects, and other projects from elective courses such as Gender and Media (COM 444).

SLO6  Engage in collaborative problem solving, both face-to-face and in online environments.

Although students did include good examples from group projects, and some students elaborated on their role in the projects, the panel found it difficult to gauge student engagement in live group processes and outcomes. We recommend assessing this SLO more at the course level where instructors can observe group dynamics more directly.

SLO7  Analyze the ethical dimensions of communication.

Although ethical analysis could be inferred from some of the portfolio materials, formal analysis of ethical dimensions was lacking in many portfolios. The panel noted that COM 460, Media Ethics, is only offered to a limited number of students each semester and is not required in any of the capstone tracks. Short of requiring all students to take an ethics course, the panel recommends a continued emphasis on ethics in other courses. The “Identifying potential portfolio components” handout for students indicates that certain assignments from COM 432 and COM 438 include the application of ethical frameworks. In future portfolios, it may be good to ask the students to explain how they applied and analyzed ethical concepts (see process suggestions below).

The panel also discussed examples of ethics assignments that could be integrated into other tracks in the future, such as a “creative consequences” assignment in the media arts production courses.

SLO8  Critically evaluate the use of technology in communication.

This SLO fit particularly well with the ICTs & policy track. The panel found an abundance of evidence across assignments and across portfolios indicating that this is a strong point in the program.

Process Suggestions

These suggestions can be elaborated upon and discussed more among faculty face-to-face, but the following list summarizes the panel’s suggestions for areas of improvement in the process.

- Simplify the rubric. The panel found the rubric a bit too granular for general analysis of the program. A simpler rubric would suffice as a framework for program-level feedback and a summary of performance on SLOs. Carefully reviewing each entire portfolio page by page and assignment by assignment, and then thoughtfully weighing each SLO at the subtle levels of distinction between rubric categories, was a tedious task.

- Some of the SLOs could be condensed/combined or assessed in a different forum. For example, consider combining SLO1 and SLO4. SLO6 could be assessed directly in the classroom context. Some students did self-assess SLO6 better than others, but even in those cases it was difficult for the panel to judge how well students met deadlines, moved projects forward, articulated alternate viewpoints to team members, etc.

- Make the purpose of the assessment clearer to panelists from the beginning. Although the invitation to participate indicated that it was not the panel’s job to “grade” the students individually, the rubric seemed to be designed as such. Panelists noted that in some ways the student instructions (the “Identifying potential portfolio components” document) were more useful than the rubric, since they showed specifically how projects submitted as evidence could indicate the SLOs.

- More consistency in the presentation of portfolios would be helpful. This applies to how the portfolios are packaged as well as how they appear to panelists working with different versions of the software. Specifically, the “i” buttons for more information describing various pieces were very useful in some portfolios because students used this space to put the evidence in context. However, not all students did this, and not all panelists were aware of this function. This concern leads to the next two suggestions.

- The panel could meet twice. The first meeting would explain the process and purpose of assessment, demonstrate the software, answer questions about SLOs, and make sure everyone has access to all the files. The second meeting would be to discuss the evidence and offer program feedback. Relatedly, the time between the two meetings should be at least two weeks to allow adequate time for reviewing the files.

- The panel found the introductory statements to be the most useful and informative components in many cases. Consider asking students to elaborate on these, describing how and why they chose each piece of evidence in relation to specific SLOs. This is not busy work; it would itself be an integrative assignment challenging students to do what the panel was asked to do: assess holistically what they have learned from the program by putting all the pieces together.

7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.

Please see question 6 for the complete report.

8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

The complete ICTs and Policy capstone cohort (one of four capstone tracks) was assessed. This totaled 15 students.

9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)

10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)

11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.

Please see question 6 for the full report.

12) State how the program used the results or plans to use the results. Please be specific.

We are making some process changes (e.g., revising the rubric and the instructions to students). We are also discussing a revision of the SLOs to make them more explicit for assessment. These changes are outlined in question 6.

13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.

Overall, the first assessment panel went very well and we gained useful feedback. We also felt that inviting alumni and other professionals to be part of the assessment team strengthened our community relationships.

14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.

We have also held a number of ad hoc faculty discussions throughout the year, as well as one formal meeting, attended by the majority of faculty, to discuss assessment.