Unit: Communication and Info Sci (PhD, interdisciplinary)
Program: Communication & Info Sci (PhD)
Degree: Doctorate
Date: Thu Oct 02, 2014 - 10:05:40 am

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

(SLO1) Demonstrate understanding of research methods and subject knowledge in the field of Communication and Information Sciences

(SLO2) Synthesize diverse data, theories, and methods

(SLO3) Demonstrate the ability to conduct research 

(SLO4) Propose and conduct original research

(SLO5) Develop and articulate a professional identity as a contributing member of a research community

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.hawaii.edu/cis/
Student Handbook. URL, if available online: http://www.hawaii.edu/cis/?page=policies
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:

3) Select one option:

Curriculum Map File(s) from 2014:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program assessment activities between June 1, 2013 and September 30, 2014? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)

Yes
No (skip to question 14)

6) For the period between June 1, 2013 and September 30, 2014: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.

Last year, CIS implemented an assessment rubric for dissertation proposals and final dissertations, adapted from models proposed by the Manoa Assessment Office and integrated into the CIS Policy document posted on our website: http://www.hawaii.edu/cis/documents/CIS_Policies_June_2014.pdf (see Appendix). The assessment goal for this period was to evaluate the assessment instrument and its implementation, which we did at a CIS Town Hall meeting that included all students and six faculty.

7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.

Data from the first three dissertation proposal assessments

Data gathered at the CIS Town Hall meeting, a focus-group-like setting in which students and some faculty were invited to comment on and discuss the instrument and its implementation.

8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Dissertation proposal assessment rubric: 15 faculty members (5 committee members for each of 3 students)

CIS Town Hall meeting: 26 students + 6 faculty + the program chair = 33

9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.

Students and faculty agreed that the following changes should be made:

--Present the assessment rubric earlier.  The brief survey had been handed to dissertation committee members during the proposal/final defense, after the student had presented the research and while the committee was deliberating on the defense outcome.  Both students and faculty felt that so much was going on at that point that the assessment instrument was a distraction, and that responses were not as well considered as they might have been.

--De-emphasize numerical ratings.  Because CIS is an interdisciplinary PhD program that requires students to constitute their committees with faculty from diverse disciplines, there are understandable differences in committee members' standards for dissertation proposal elements.  In addition, some faculty used the surveys to "send a message" to students at the proposal defense stage, assigning a lower-than-warranted rating to motivate them to improve particular areas.

12) State how the program used the results or plans to use the results. Please be specific.

The CIS Chair agreed to propose two changes to the CIS Executive Board at its Fall 2014 meeting: (1) the assessment instrument will be given to committee members when the student releases the document to them prior to the defense, and (2) rather than assigning a numerical rating on a 1-4 scale, committee members will comment specifically on each area of the dissertation proposal (problem statement, literature review, method, completion plan) or dissertation (problem statement, literature review, method, analysis, original contribution).

13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.

The first seminar session of each semester is dedicated to a CIS Town Hall meeting, which has proven to be an ideal venue for gathering broad input on all program-related matters.

14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.