Program: Communication (MA)
Date: Mon Nov 16, 2015 - 10:09:29 am
1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)
1) Below are your program's student learning outcomes (SLOs). Please update as needed.
The goals of the Master's Degree Program in Communication within the School of Communications are to build and exchange knowledge in areas relevant to the broad field of Communication and to our specific foci in organizational and intercultural communication, global communication, information and communication technologies, social media, and communication policy and planning. This knowledge is defined in our program as including both sociocultural and sociotechnical perspectives. These goals are supported by our curriculum, research activity, and networking with faculty, fellow students, and outside resources. Our SLOs are published, in varying wordings, in our Program informational brochures, our Program website, and our Student Handbook, and are most extensively described and discussed in our new student orientation, presented each August prior to the new academic year.
In 2012, we created a formal list of SLOs.
1. Demonstrate subject mastery in areas of communication relevant to personal research interests.
2. Identify research questions on a contemporary issue in communication, and perform a critical, written analysis of the relevant literature.
3. Develop specific research questions related to personal research interests.
4. Identify an appropriate, empirical methodology (or media approach) to address the selected research problem.
5. Demonstrate mastery of the methodology and techniques specific to the field of study. Analyze and interpret research data.
6. Present and discuss, in written form, the findings and relevance of the research project to the field of communication and to broader society.
7. Present, discuss, and defend the findings and relevance of the research project to the field of communication in an oral defense.
2) Your program's SLOs are published as follows. Please update as needed.
Student Handbook. URL, if available online: http://communications.hawaii.edu/documents/com/pdf/Student_Handbook.pdf
Information Sheet, Flyer, or Brochure. URL, if available online: NA
UHM Catalog. Page Number: 108
Course Syllabi. URL, if available online: NA
Other: new student orientation at the beginning of each academic year
Other: Communication graduate list/forum in Laulima; graduate program poster board in department
3) Please review, add, replace, or delete the existing curriculum map.
- File (03/16/2020)
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.
5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?
Yes
6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
7) Briefly explain the assessment activities that took place in the last 18 months.
We developed a rubric to help us assess our MA program SLOs. Each student who defended their MA thesis (a requirement for graduation) provided evidence. The Graduate Chair used evidence of successful course completion to assess SLO 1; the remaining SLOs were assessed shortly after the final thesis defense, based on the thesis document and the oral defense. We later held informal discussions about the results and used them to adjust content in a few courses and to strengthen our rationale for proposing a new course that we were already considering.
8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)
Direct evidence of student learning (student work products)
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Indirect evidence of student learning
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement
Student surveys that contain self-reports of SLO achievement
Other 1: Successful completion of content area courses in the student's specialty area
Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)
Assessment-related such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.
9 (all were included)
10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)
Ad hoc faculty group
Persons or organization outside the university
Advisors (in student support services)
Students (graduate or undergraduate)
Other: Graduate Chair
11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other: for SLO1, Graduate Chair checked to see that students successfully completed coursework in specialization content area
12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.
We are pleased to report that we met our benchmark (80%) on all SLOs.
(Rubric results table: ratings of Unacceptable, Marginal, Proficient, and Exemplary, with the total meeting the benchmark for each SLO.)
13) What best describes how the program used the results? (Check all that apply.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
14) Please briefly describe how the program used the results.
First, we announced our success to our incoming COM MA cohort at our orientation session in August. (Students were actually quite interested in this and seemed pleased.)
Although we met the benchmark for all SLOs, and the only marginal work was limited to a single thesis, we have had informal conversations about what an ideal benchmark should be. For example, should we aim for more of our students scoring Exemplary rather than Proficient on some, or all, SLOs?
We have also had informal discussions about how to better bridge the gap between the research literature and finding an appropriate research question/problem. These discussions have led to some advising changes and also some smaller changes in individual courses to highlight best practices more clearly.
Finally, we proposed a new course focused on research proposal development and methods. Although the gap in this area is not large, additional coverage here could strengthen our students' learning outcomes in several ways.
15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.
We have not yet held a formal meeting of our graduate faculty (the results came in only a few weeks ago). However, the process and results have illuminated a few areas for further discussion that may yield notable improvements.
We were also pleased to confirm that our students are doing great work, and we now have specific data to support this when we share the news with others.