Program: Nutrition (PhD)
Degree: Doctorate
Date: Wed Mar 16, 2016 - 2:04:52 pm
1) Below are your program's student learning outcomes (SLOs). Please update as needed.
Based on content in the UHM course catalog:
The PhD program in Nutrition is designed to prepare future leaders and innovators who can expand our knowledge about food and health, solve nutrition-related problems, propose effective nutrition policies, guide new product and service development, and be effective researchers, communicators and educators. To ensure that graduates are prepared for these roles, students will be expected to demonstrate:
- Comprehensive understanding of core nutrition knowledge;
- Advanced scholarship in a specialty area (i.e., expertise in at least one overlapping biomedical discipline e.g., biochemistry, physiology, cell and molecular biology, food science/functional foods, epidemiology, biostatistics, medicine, etc.);
- Appropriate exposure to social and career-building disciplines (e.g., education, communications, information technology, technical writing, social sciences, etc.);
- Ability to conduct original scholarly research, develop skills in research methodologies and grant writing, understand research ethics, and effectively disseminate research findings via peer-reviewed publications, seminars, and practical applications such as teaching.
Based on simplified content on the PhD Nutrition Program website:
Students in the Intercollege Nutrition PhD Program will be expected to demonstrate:
- Comprehensive understanding of core nutrition knowledge
- Advanced scholarship in a specialty area
- Appropriate exposure to social and career-building disciplines
- Skills in research methodologies, demonstrated by conducting original scholarly research
- Skills in grant writing
- Understanding of research ethics
- Effective dissemination of research findings via peer-reviewed publications, seminars, and practical applications such as teaching
1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)
2) Your program's SLOs are published as follows. Please update as needed.
- [7 checkbox options; options 4 and 6 checked (option labels not captured in this export)]
3) Please review, add, replace, or delete the existing curriculum map.
- File (03/16/2020)
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.
- [5 radio options; option 5 selected (option labels not captured in this export)]
5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?
- [2 radio options; option 1 selected (option labels not captured in this export)]
6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)
- [7 checkbox options; options 1 through 4 checked (option labels not captured in this export)]
7) Briefly explain the assessment activities that took place in the last 18 months.
- Evaluated which SLOs were met using examination rubrics (Qualifying Exam, Comprehensive Exam and Dissertation Proposal, and Dissertation Defense)
- In the process of revising the curriculum map, based on feedback from last year's assessment report, to follow the IRMA format and to ensure the curriculum aligns with the SLOs
- Held faculty meeting discussing curriculum map revisions (September 15, 2015)
- Students completed end of semester evaluations (Fall 2014 and Spring 2015)
- End of semester teaching evaluations completed (Fall 2014 and Spring 2015)
8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)
Direct evidence of student learning (student work products)
- [14 checkbox options; options 7, 10, and 11 checked (option labels not captured in this export)]
Indirect evidence of student learning
- [7 checkbox options; option 5 checked (option labels not captured in this export)]
Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)
- [4 checkbox options; option 1 checked (option labels not captured in this export)]
9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.
Enrollment in the program is small (6 students were enrolled in the 2014-2015 academic year).
- 5 students completed the end of semester evaluation in both Fall 2014 and Spring 2015
- Survey emailed to all students, but completion was not required
- 1 student completed the alumni survey
- Survey emailed to all alumni (3 total at that time) in summer 2015; only 1 response was received
- 2 end of semester teaching evaluations were completed by advisors for 2 students
- Survey emailed to the teaching advisors of the 2 students enrolled in a teaching experience during that time period, with both advisors completing it
- 3 Dissertation Defense rubrics completed for 3 students
- Completed by the committee at the time of the defense
- 2 Qualifying Examination rubrics completed for 1 student
- Completed by the committee at the time of the examination
- The student had to repeat the exam due to poor performance the first time
10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)
- [10 checkbox options; option 10 checked (option labels not captured in this export)]
11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)
- [7 checkbox options; options 1 and 4 checked (option labels not captured in this export)]
12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.
Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
- Curriculum map is currently under revision - had one initial discussion with faculty but revision needs to be confirmed
- The curriculum revision was informed by feedback from the last assessment report and by one student's poor performance on the qualifying examination, which prompted a re-examination of the core curriculum and an investigation into whether every SLO has an appropriate assessment point
- Rubrics modified to note which specific examination domains are associated with an SLO
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
- Surveys were distributed to students via email at the end of the fall and spring semesters last year; 4 students completed the survey in the fall semester and 5 in the spring semester.
- Career-building SLO (#3): across both semesters (n=9), responses to "Please rank your career building skills" were good (n=1) or very good (n=8)
- Grant writing SLO (#5): across both semesters (n=9), responses to "Please rank your grant writing skills" were fair (n=1), good (n=5), very good (n=2), and excellent (n=1)
- Surveys were distributed to the teaching advisors of the 2 students with teaching experiences
- Results did not clearly demonstrate how well the students were performing as teachers. Feedback indicated that one student gave lessons while the second did not; therefore, a teaching experience rubric was developed to more adequately address the dissemination SLO (#7)
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
- Results of the curriculum map revision activities are informing revisions to the degree requirements: adding specific courses to the required list (e.g., a grant writing course and a career-building skills course) and reformatting the dissertation proposal as a mock grant to serve as an assessment point for SLO #5 on grant writing.
- Programmatic benchmarks for meeting the SLOs, based on the rubrics, still need to be developed
13) What best describes how the program used the results? (Check all that apply.)
- [9 checkbox options; options 1, 4, and 8 checked (option labels not captured in this export)]
14) Please briefly describe how the program used the results.
- Revision of the curriculum map
- Recommendations on required coursework and modification of the dissertation proposal format to align with a mock grant proposal
- Creation of a rubric for assessing the teaching experience
- Modification of rubrics to assess the appropriate SLOs
- Reinvigorated collaboration and planning with MS Nutritional Sciences Program on curricular changes.
15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.
Opened discussion among faculty about other opportunities for alignment across courses.
Assessment will contribute to a stronger program.
Strengthened momentum for curricular changes.
The program will also share results with current students to gather feedback on the proposed changes.
The opportunity for the program Assessment Coordinator to attend the 3-d Assessment training in August was also very helpful to this process (Mahalo!).
16) If the program did not engage in assessment activities, please explain.
Not applicable