Unit: Communication and Info Sci (PhD, interdisciplinary)
Program: Communication & Info Sci (PhD)
Degree: Doctorate
Date: Tue Oct 09, 2012 - 11:16:41 am

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

The student is expected to spend two to three years, depending on the student’s background,

(A) obtaining comprehensive mastery of the methods and substance in the field of Communication and Information Sciences;

(B) developing the ability to productively synthesize diverse data, theories, and methods; and

(C) demonstrating the ability to conduct research prior to proposing a dissertation study. 

The student then focuses on

(D) proposing and conducting original research in his or her area, and

(E) writing and defending a dissertation on that research. 

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.hawaii.edu/cis/?page=policies
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:

3) Select one option:

Curriculum Map File(s) from 2012:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.


5) Did your program engage in any program assessment activities between June 1, 2011 and September 30, 2012? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)

No (skip to question 14)

6) For the period June 1, 2011 to September 30, 2012: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.

Since 2008, based on issues identified during each semester's assessment activities, we have adjusted the focus-area examination system, added two required courses, and added the requirement that a research paper be published before a proposal defense may be attempted. Those requirements have been phased in, with the full impact applying to students admitted in 2011. We are monitoring student progress to evaluate whether the structural changes are producing the desired results.

If the reforms are successful, students will progress through the exam stage more quickly and will attempt collaborative and individual research earlier in their academic careers. They will also arrive at the proposal stage with a better understanding of the appropriate research methods to be applied.

7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.

Each semester, every PhD student prepares a self-report documenting academic progress. These self-reports are compiled, along with the results of exams attempted and publication attempts. The CIS Executive Committee meets to evaluate each student individually and to compare each student's progress with that of others who entered at the same time. The CIS Program Chair reports the results of the assessment to every student, informing them whether they are in good standing and meeting with students who need guidance or support to maintain satisfactory progress.

8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

All 33 PhD students submit self-reports. The four faculty members of the Executive Committee evaluate the reports. The Program Chair serves as mentor to all newly admitted students and remains in contact with them as they locate research mentors and/or dissertation chairs. The seven focus areas each have exam committees of three faculty members. Because of significant overlap among these roles, an exact count of persons is both difficult to determine and an inaccurate gauge of involvement.

9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Other: Program Coordinator and Executive Committee

10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)

11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.

We are continuing to monitor and evaluate student progress in response to program changes but do not yet have conclusions.

12) State how the program used the results or plans to use the results. Please be specific.

The program will use the results to modify the Policies and Procedures document that sets markers for student progress through the program. In the past, modifications have changed the number of exams, the types of courses, and the time allowed for completing both.

13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.

We are now using information about both the types of students who excel and those who encounter difficulty to look for red flags in the admissions process. For example, students who attempt this program while employed full time frequently have difficulty meeting the timelines, regardless of the type of employment. We use this evidence to advise applicants who wish to make similar attempts.

14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.

We have identified a distinct assessment project for each of the past four years. The results of past assessments yielded program changes that successfully moved slow-to-progress students through or out of the CIS Program. As a result, roughly two-thirds of our current students are pre-ABD, and we are assessing their exam success, publishable papers, and time to proposal defense as key indicators of the effectiveness of our revised program policies.

We now wish to articulate program-level assessment more formally. The end-of-semester review of student progress is also a review of how the program is functioning: where it is doing well, where we have gaps, and where we need resources. Our goal for the coming year is to fold the previous assessment projects, the regular student progress reports, and the annual report to Graduate Division into a systematic evaluation of the overall health of the program.