Program: Communication & Info Sci (PhD)
Degree: Doctorate
Date: Tue Nov 17, 2015 - 1:05:50 pm
1) Below are your program's student learning outcomes (SLOs). Please update as needed.
(SLO1) Demonstrate understanding of research methods and subject knowledge in the field of Communication and Information Sciences
(SLO2) Synthesize diverse data, theories, and methods
(SLO3) Demonstrate the ability to conduct research
(SLO4) Propose and conduct original research
(SLO5) Develop and articulate a professional identity as a contributing member of a research community
2) Your program's SLOs are published as follows. Please update as needed.
Student Handbook. URL, if available online: http://www.hawaii.edu/cis/?page=policies
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:
3) Please review, add, replace, or delete the existing curriculum map.
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.
1-50%
51-80%
81-99%
100%
5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?
6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, and the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:
7) Briefly explain the assessment activities that took place in the last 18 months.
8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)
Direct evidence of student learning (student work products)
Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:
Indirect evidence of student learning
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:
Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)
Assessment-related documents such as the assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:
9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.
Dissertation proposal assessment rubric: 15 faculty members (5 per committee, 3 students)
CIS Town Hall meeting: 26 students + 6 faculty + the program chair = 33
10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:
11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:
12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.
Students and faculty agreed that the following changes should be made:
--Present the assessment rubric earlier. The brief survey had been handed to dissertation committee members during the proposal/final defense, after the student had presented research and while the committee was deliberating on the defense outcome. Both students and faculty felt that there was so much going on during this time that the assessment instrument was a distraction, and responses were not as well considered as they might have been.
--De-emphasize numerical ratings. Because the interdisciplinary PhD program requires students to constitute their committees with faculty from diverse disciplines, there are understandable differences in committee members' standards for dissertation proposal elements. Also, some faculty used the surveys to "send a message" to students at the proposal defense stage with a lower-than-warranted rating, to motivate them to improve particular areas.
13) What best describes how the program used the results? (Check all that apply.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:
14) Please briefly describe how the program used the results.
The CIS Chair agreed to propose to the CIS Executive Board at its Fall 2014 meeting that the assessment instrument be given to committee members when the student releases the document to them prior to the defense, and that, rather than assigning a numerical rating on a 1-4 scale, committee members comment specifically on their assessment of each area of the dissertation proposal (problem statement, literature review, method, completion plan) or dissertation (problem statement, literature review, method, analysis, original contribution).
15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.
The first seminar session of each semester is dedicated to a CIS Town Hall meeting, and this has proven to be an ideal place to get broad input on all program-related matters.
16) If the program did not engage in assessment activities, please explain.
During his term from S2013-F2014, the CIS Director (Dr. Rich Gazan) led the program steering committee through an extensive review and assessment of the program. The steering committee (including the current Director, Dr. Elizabeth Davidson) participated in the design of assessment actions, their implementation, and the review/critique of their effectiveness. The Steering Committee (composed of the Department/School chairs of the four participating units) is satisfied with the current assessment structure. In S2015 the committee reviewed the status of assessment and concluded that the program should proceed with the current process over the next year as results are accumulated and examined. This also supports the transition of program directors from Dr. Gazan to Dr. Davidson, as the program learns through its experiences with assessment where further work may be needed.
Both the former and current Directors are concerned that the .5 FTE administrative support position, which became vacant in February 2014, has not been filled due to budget constraints. The lack of any administrative support for the conduct of the program will limit the degree to which assessment can be carried out effectively in the future.