Unit: Natural Resources & Environmental Management
Program: Environmental Management (MEM)
Degree: Master's
Date: Thu Nov 19, 2020 - 4:29:32 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Students demonstrate knowledge of social and ecological principles, and interdisciplinary aspects of natural resource and environmental management issues

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest.)

2. Students can analyze and address natural resource and environmental management problems by using appropriate methods from social and/or natural science disciplines

(2. Demonstrate understanding of research methodology and techniques specific to one’s field of study., 3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

3. Students communicate effectively, both orally and in writing, to diverse audiences including professionals, resource managers, local communities and policy makers

(5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience.)

4. Students can: a. Conduct scientific research of professional quality in their specialization area (M.S. Plan A); b. Conduct a capstone project of professional quality to acquire practical experience by applying NREM knowledge (M.S. Plan B, M.S. Plan C, MEM)

(3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

5. Students can function as professionals in their specialization area by demonstrating responsible and ethical conduct, effective collaboration, informed decision making, and life-long learning

(6. Conduct research or projects as a responsible and ethical professional, including consideration of and respect for other cultural perspectives., 7. Interact professionally with others.)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://cms.ctahr.hawaii.edu/nrem
Student Handbook. URL, if available online: https://cms.ctahr.hawaii.edu/nrem/GRADUATE
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: Not available online; contact NREM directly for syllabi
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2020:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.") (Check one):

No
Yes, on some (1-50%) of the program SLOs
Yes, on most (51-99%) of the program SLOs
Yes, on all (100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)
Other:

8) Briefly explain the assessment activities that took place since November 2018.

We use information from the core courses (NREM 600, 601, 695, and 696) and their related products and presentations to evaluate SLO achievement. The instructors of these courses developed detailed rubrics, which we retain as evidence. The department has discussed curriculum changes (e.g., the rules governing how many NREM course credits are required, and regular updates to the list of concentration courses). As a department, we have also discussed revising the concentration areas to better align with faculty expertise.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Materials from all MEM students (n=39) were evaluated.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.

Materials from the core courses are assessed with detailed rubrics that evaluate student performance across multiple categories. Student defenses of capstone proposals (n=24) and project reports (n=34) are similarly assessed via a rubric. All of these courses are designed to address the program SLOs.

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used its findings/results.

Instructors of the core courses created rubrics for evaluation. Additional courses were added to the curriculum, particularly courses related to quantitative skills. We attempted to hire two additional social scientists, but the University froze the hires. We regularly update our list of elective courses for students and, at one point, waived NREM credit requirements because too few elective courses were being offered.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

The rubrics for the core classes have been very useful in maintaining consistent expectations across instructors and core courses.

17) If the program did not engage in assessment activities, please justify.

N/A