Unit: Life Sciences
Program: Botany (BS)
Degree: Bachelor's
Date: Thu Nov 15, 2018 - 10:16:59 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Students can recall details and information about the evolution, anatomy, morphology, systematics, genetics, physiology, ecology, and conservation of plants, algae, and fungi.

(1b. Specialized study in an academic field, 3c. Stewardship of the natural environment)

2. Students can recall details of the unique ecological and evolutionary features of the Hawaiian flora.

(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history)

3. Students can communicate effectively using oral and written communication skills.

(1a. General education, 2c. Communicate and report)

4. Students can generate and test hypotheses, make observations, collect data, analyze and interpret results, derive conclusions, and evaluate their significance within a broad scientific context.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.botany.hawaii.edu/undergraduate/
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2018:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

No
Yes, on some (1-50%) of the program SLOs
Yes, on most (51-99%) of the program SLOs
Yes, on all (100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between June 1, 2015 and October 31, 2018?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period June 1, 2015 to October 31, 2018? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
No (skip to question 17)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)
Other:

8) Briefly explain the assessment activities that took place.

We administer a multiple-choice exam to assess SLOs 1 and 2. It is administered once in our freshman seminar (Botany 100), as a "pre" test to assess student knowledge as new Botany majors, and again in our senior seminar (Botany 400), as a "post" test to assess their knowledge as they exit the program. The exams are anonymous. Thus, the results record the knowledge of the freshman cohort entering the Botany program and their knowledge as seniors when they leave our program.

While assessment exams were given for Botany 100 starting in Fall 2014, I have been unable to locate those exams and will not be able to offer results for previous years. I have, however, contacted the instructor for the course this semester (Fall 2018) and have given him the exam to administer to the class at the end of the semester. I do have the results for students taking the exam in Botany 400 in 2017-18. However, without the exam results for Botany 100 from Fall 2014 onward, I will be unable to make an assessment of students exiting the program in 2017-18.

An assessment comparing entering freshmen and exiting seniors will now not be possible until 2021, since we will have to start anew with the current freshman class of 2018.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Five BS students took the exam in 2018. One of the five was actually in the Ethnobotany degree program. However, because we are no longer admitting students into our undergraduate Ethnobotany program, I have grouped that student with the regular BS Botany students. Since the SLOs are the same, I hope that this is acceptable.

With the small number of students being evaluated, any sampling technique would not be meaningful. Over time, as the number of students taking the exam accumulates, an assessment can then be made. However, it is uncertain when that will be, since it is not possible to predict the number of students entering the program or the number who will graduate.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: I am a faculty member in the Botany Department who has been assigned to submit the Botany SLA.

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results of the assessment activities checked in question 7. For example, report the percentage of students who achieved each SLO.

I do not have results for Botany 100 (please see question 8). Of the five BS students who took the exam in Botany 400 in 2018, the scores were 66%, 72%, 76%, 82%, and 86%. With 70% and above as the threshold for achieving the SLOs, four of the five achieved the SLO goals and one (66%) did not.

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used the results.

Nothing to report at this time.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

No, not yet.

17) If the program did not engage in assessment activities, please justify.