Unit: Communicology
Program: Communicology (MA)
Degree: Master's
Date: Thu Oct 01, 2015 - 2:35:59 pm

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

  1. Demonstrate mastery of theories of communication, particularly in the areas of relational, persuasion/social influence, and message processing functions

  2. Demonstrate mastery of fundamentals of research design and analysis in communication
     
  3. Demonstrate an integrative and systematic understanding of the human communication process

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: manoa.hawaii.edu/communicology
Student Handbook. URL, if available online: https://laulima.hawaii.edu/access/content/group/51af1472-6046-45d0-a17f-4b168b0b7118/Graduate%20Handbook%202012.pdf
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2015:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

Yes
No (skip to question 16)

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, and the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

In a meeting at the end of each Fall and Spring semester (usually two hours long), the entire faculty assesses how well each current and active graduate student is meeting the SLOs. Faculty who cannot be present submit written notes on the students, and those notes are read aloud and shared at the meeting. Using a rubric based on the SLOs, we rate the extent to which each graduate student is exceptional, acceptable, developing, or unacceptable in meeting each SLO. Consensus is the primary mode of decision-making.

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related documents such as the assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

We evaluate all current and active graduate students, so no sampling is used (N = 12 in Fall 2014 and N = 13 in Spring 2015; see question 12). All faculty attend the assessment meeting; those who cannot be present submit written notes on the students, and those notes are read aloud and shared at the meeting.

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: Graduate Chair

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

Fall 2014 End-of-Semester Student Performance Evaluation Results

Sample (N=12): first semester (n=5), second semester (n=0), third semester (n=7), fourth semester (n=0)

SLO | Unacceptable | Developing | Acceptable | Exceptional | N/A or Unable to Judge
SLO1: Theory | 0 | 8 | 4 | 0 | 0
SLO2: Research | 0 | 5 | 5 | 2 | 0
SLO3: Understanding | 1 | 7 | 3 | 1 | 0
SLO4: Ph.D. Preparation | 1 | 5 | 2 | 0 | 4
SLO5: Conduct Research | 0 | 5 | 5 | 2 | 0
SLO6: Oral Articulation | 2 | 6 | 3 | 1 | 0
SLO7: Written Articulation | 1 | 8 | 2 | 1 | 0
SLO8: Presentation/Teaching | 1 | 4 | 5 | 2 | 0
TOTAL | 6 (7%) | 48 (40%) | 29 (30%) | 9 (9%) | 4 (4%)

Note: The benchmark against which MA students are evaluated on each SLO is an exiting or graduating MA student; the benchmark is the same regardless of whether a student is in the first semester, second semester, etc.

 

 

Spring 2015 End-of-Semester Student Performance Evaluation Results

Sample (N=13): first semester (n=0), second semester (n=5), third semester (n=0), fourth semester (n=7), fifth semester (n=1)

SLO | Unacceptable | Developing | Acceptable | Exceptional | N/A or Unable to Judge
SLO1: Theory | 0 | 5 | 4 | 4 | 0
SLO2: Research | 1 | 2 | 5 | 5 | 0
SLO3: Understanding | 0 | 3 | 8 | 2 | 0
SLO4: Ph.D. Preparation | 0 | 3 | 3 | 2 | 5
SLO5: Conduct Research | 1 | 3 | 4 | 5 | 0
SLO6: Oral Articulation | 1 | 5 | 5 | 2 | 0
SLO7: Written Articulation | 0 | 8 | 3 | 2 | 0
SLO8: Presentation/Teaching | 2 | 4 | 5 | 2 | 0
TOTAL | 5 (5%) | 33 (32%) | 37 (36%) | 24 (23%) | 5 (5%)

Note: The benchmark against which MA students are evaluated on each SLO is an exiting or graduating MA student; the benchmark is the same regardless of whether a student is in the first semester, second semester, etc.
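
As a minimal illustration of how the counts and percentages in the tables above can be tallied, the sketch below aggregates per-student rubric ratings into per-SLO counts and an overall TOTAL row. The data structure and values shown are hypothetical; the faculty record their ratings in meeting notes, not in any particular file or format.

from collections import Counter

# Hypothetical example: one rubric rating per student for each SLO,
# as recorded at the end-of-semester faculty assessment meeting.
CATEGORIES = ["Unacceptable", "Developing", "Acceptable", "Exceptional", "N/A"]
ratings = {
    "SLO1: Theory": ["Developing", "Acceptable", "Developing"],
    "SLO2: Research": ["Acceptable", "Exceptional", "Developing"],
    # ... remaining SLOs would follow the same pattern ...
}

overall = Counter()
for slo, slo_ratings in ratings.items():
    counts = Counter(slo_ratings)
    overall.update(counts)
    print(slo, "|", " | ".join(str(counts.get(c, 0)) for c in CATEGORIES))

# TOTAL row: count and percentage of all ratings falling in each category.
total = sum(overall.values())
print("TOTAL", "|", " | ".join(
    f"{overall.get(c, 0)} ({overall.get(c, 0) / total:.0%})" for c in CATEGORIES))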

 

13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

We plan to use the results to determine whether graduating students generally meet our SLOs; if they do not, we will investigate how to modify our teaching, the implementation of our required classes, and the culminating thesis or applied project. Because our MA program is relatively small, we need to accumulate several years of data before patterns will emerge. We expect that by Spring 2016 we will be able to judge more concretely whether our students are meeting our SLOs and to make appropriate changes to the program if they are not.

In the meantime, the graduate chair writes an individual letter to each active graduate student to apprise them of the faculty's overall assessment of whether they are meeting expectations on our SLOs. Based on the faculty assessment meeting, the graduate chair also shares one strength and one weakness of each student. We have found that this feedback is very effective in creating positive change in our program and helps students better meet our SLOs.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

We have seen that our graduating students tend to meet the program-level SLOs.

Students who were deficient in meeting the SLOs tended to drop out of the program.

Beyond these results, we have undertaken to assess the factors that might be especially critical when making graduate admission decisions. We examined whether an applicant's undergraduate GPA and scores on the verbal reasoning, quantitative reasoning, and analytical writing portions of the GRE predicted a graduate student's GPA upon graduation from our program, using graduation GPA as another proxy for success. We analyzed data from 2005 to 2015. The model with undergraduate GPA and the individual GRE components was statistically significant in predicting GPA upon graduation from our MA program in Communicology, F(4, 38) = 3.71, p = .01, adjusted R² = .28. The quantitative reasoning GRE score was the only significant individual predictor, β = .43, t = 2.86, p = .01. The model also indicated minimal collinearity among the four predictors: tolerance ranged from .69 to .85 and the variance inflation factor (VIF) ranged from 1.18 to 1.46.

We used this information to provide guidance to prospective graduate students and to help with admission decisions in borderline cases.

Regression Analysis of Predictors of GPA Upon Graduation from the MA Program in Communicology

Predictor | b | SE | β | t
Undergrad GPA | 0.85 | .08 | .15 | 1.03
Verbal Reasoning GRE | -0.01 | .01 | -.04 | -0.21
Quantitative Reasoning GRE | 0.01 | .01 | .43 | 2.86**
Analytical Writing GRE | 0.28 | .04 | .13 | 0.80

** p < .01
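
As an illustration only, a regression with collinearity diagnostics like the one reported above could be reproduced with a short Python script using statsmodels, sketched below. The file name and column names (undergrad_gpa, gre_verbal, gre_quant, gre_writing, grad_gpa) are hypothetical assumptions, not the program's actual data layout or analysis code.

import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical file and column names; one row per student admitted 2005-2015.
df = pd.read_csv("ma_admissions_2005_2015.csv")
predictors = ["undergrad_gpa", "gre_verbal", "gre_quant", "gre_writing"]

# Unstandardized coefficients (b), standard errors, t values, F test, adjusted R^2.
X = sm.add_constant(df[predictors])
model = sm.OLS(df["grad_gpa"], X).fit()
print(model.summary())

# Standardized coefficients (beta): refit the model on z-scored variables.
cols = predictors + ["grad_gpa"]
z = (df[cols] - df[cols].mean()) / df[cols].std()
beta_model = sm.OLS(z["grad_gpa"], sm.add_constant(z[predictors])).fit()
print(beta_model.params)

# Collinearity diagnostics: VIF and tolerance (1/VIF) for each predictor.
for i, name in enumerate(predictors, start=1):  # index 0 is the constant
    vif = variance_inflation_factor(X.values, i)
    print(f"{name}: VIF = {vif:.2f}, tolerance = {1 / vif:.2f}")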

Based on our past assessments, we now hold faculty meetings on all of the graduate students at the end of each semester, rather than only at the end of the fall semester.

We send individual letters to each student to let them know how well they are meeting our program-level SLOs.

We have also developed SLOs for our graduate courses and have placed them on the syllabi.

We also now discuss the program-level SLOs at our orientation session for new graduate students, which occurs during the week before the Fall semester begins.

16) If the program did not engage in assessment activities, please explain.

Not applicable.