Unit: Kinesiology & Rehabilitation Science
Program: Kinesiology & Rehab Sci (BS)
Degree: Bachelor's
Date: Wed Oct 07, 2015 - 2:26:57 pm

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

1. Students will demonstrate knowledge of anatomical, physiological, biomechanical, and psychological principles of how the body moves in relation to space, time, and distance.

(1a. General education, 1b. Specialized study in an academic field)

2. Students will demonstrate knowledge in the application of movement principles and related concepts.

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 3a. Continuous learning and personal growth)

3. Students will demonstrate and communicate the ability to coordinate, plan, manage, and facilitate exercise prescription and information.

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report, 3a. Continuous learning and personal growth)

4. Students will demonstrate the application of healthy-lifestyle programming in applied and research venues.

(1a. General education, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report)

5. Students will demonstrate civic responsibility through a service learning project [capstone experience].

(3d. Civic participation)

6. Students will demonstrate pro-social skills and professional dispositions in human interaction, especially with persons of color and Native Hawaiians.

(1c. Understand Hawaiian culture and history, 3b. Respect for people and cultures, in particular Hawaiian culture, 3c. Stewardship of the natural environment)

7. Students will be able to demonstrate culturally responsive teaching and interaction with persons of color and Native Hawaiians.

(1c. Understand Hawaiian culture and history, 3b. Respect for people and cultures, in particular Hawaiian culture)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://coe.hawaii.edu/academics/kinesiology-rehabilitation-science/bs-program
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other: Departmental brochure
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2015:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

Yes
No (skip to question 16)

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, and the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

The assessment questions are presented below, along with the direct evidence collected for each and a description of that evidence.

Assessment Question # 1- Knowledge- What should the students know and be able to do by the completion of the program?  

1. KRS 353 – Structural Anatomy Rubric: Students must demonstrate knowledge of structural anatomy. They are rated on their overall knowledge of gross anatomy through major exams and quizzes and on their anatomical knowledge through identification of anatomical structures. Students receive a cumulative score for all exams and quizzes and are rated unacceptable, acceptable, or target on the following elements: knowledge of gross human anatomy; ability to identify anatomical structures; and ability to apply knowledge. For structural anatomy, students are asked to identify anatomical structures and are rated on the following scale: 1 = unacceptable, 2 = acceptable, 3 = target.

2. KRS 354L Rubric: Students must demonstrate knowledge of measurement techniques for the following: cardiovascular responses to graded exercise; skeletal muscle motor recruitment and responses to load and fatigue; the relationship between static and dynamic strength and between absolute and relative strength; metabolic pathways used to develop anaerobic power and blood lactate; metabolic pathways used to develop energy aerobically; and understanding of body composition. Students are rated on the following scale: 1 = unacceptable, 2 = acceptable, 3 = target.

3. KRS 463 – Biomechanics Rubric: Students must demonstrate knowledge of biomechanics. They are rated on their overall knowledge of biomechanics through major exams and quizzes and on their direct application of biomechanical knowledge. Students receive a cumulative score for all exams and quizzes and are rated unacceptable, acceptable, or target on the following elements: combination of human anatomy and physics; application of mathematical skills to motor activity; and association of biomechanical principles with analysis of fundamental activities. For application of biomechanical knowledge, students are asked to apply biomechanical principles directly to sporting activities and are rated on the following scale: 1 = unacceptable, 2 = acceptable, 3 = target.

Indirect evidence includes program completer surveys distributed by the Dean’s Office to candidates in their final semester. These data are published in reports aggregated by program on the College of Education Intranet and are also reported on the College of Education website under “Measuring Our Success.”

Data for AY 2014–2015 (Fall 2014, Spring 2015, Summer 2015) are reported in Question # 12.

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Below is a table showing each assessment question, the direct evidence used for it, and the number of students who submitted direct evidence that was evaluated.

Direct Evidence (N = 512)

                                                   Fall 2014   Spring 2015   Summer 2015

Assessment Question # 1 – Knowledge: What should the students know and be able to do by the completion of the program?

  1. KRS 353 – Structural Anatomy Rubric           n = 41      n = 75        n = 33
  2. KRS 354L Rubric                               n = 47      n = 48        n = 18
  3. KRS 463 – Biomechanics Rubric                 n = 36      n = 35        n = 22

Assessment Question # 2 – Skills: What skills should the students be able to demonstrate and apply in a real-world setting?

  1. Entry Level Volunteer Evaluation Rubric       n = 20      n = 39        n = 19

Assessment Question # 3 – Dispositions: How well do our students demonstrate professional dispositions?

  1. KRS 488 – Practicum Evaluation Form           n = 18      n = 40        n = 21
  2. KRS 488 – Professional Dispositions Rubric    n = 18      n = 40        n = 21

No sampling technique was applied beyond the choice of classes (a convenience sample): all students enrolled in the identified classes were assessed.

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

Data Table

AY 2014-2015

Scoring rubrics for direct evidence in the identified classes have been aggregated and presented below.

Assessment Question # 1
Direct Evidence: KRS 353 – Structural Anatomy Rubric; KRS 354L Rubric; KRS 463 – Biomechanics Rubric

                                                 Fall 2014 (N = 124)     Spring 2015 (N = 158)   Summer 2015 (N = 73)
                                                 U      A       T        U      A       T        U      A       T
  %                                              4.03%  92.74%  3.23%    4.43%  91.77%  3.80%    4.11%  90.41%  5.48%
  ILO # 1
    1a. General Education
      1a1                                        5      115     4        7      145     6        3      66      4
      1a2                                        5      115     4        7      145     6        3      66      4
      1a3                                        5      115     4        7      145     6        3      66      4
    1b. Specialized Study in an Academic Field
      1b1                                        5      115     4        7      145     6        3      66      4
      1b2                                        5      115     4        7      145     6        3      66      4
      1b3                                        5      115     4        7      145     6        3      66      4
    1c. Understand Hawaiian Culture and History
      1c6                                        5      115     4        7      145     6        3      66      4
      1c7                                        5      115     4        7      145     6        3      66      4

Assessment Question # 2
Direct Evidence: Entry Level Volunteer Evaluation Rubric

                                                 Fall 2014 (N = 20)      Spring 2015 (N = 39)    Summer 2015 (N = 19)
                                                 U      A       T        U      A       T        U      A       T
  %                                              0%     85%     15%      0%     89.74%  10.26%   0%     89.47%  10.53%
  ILO # 2
    2a. Think Critically and Creatively
      2a2                                        0      17      3        0      35      4        0      17      2
      2a3                                        0      17      3        0      35      4        0      17      2
      2a4                                        0      17      3        0      35      4        0      17      2
    2b. Conduct Research
      2b3                                        0      17      3        0      35      4        0      17      2
      2b4                                        0      17      3        0      35      4        0      17      2
    2c. Communicate and Report
      2c3                                        0      17      3        0      35      4        0      17      2
      2c4                                        0      17      3        0      35      4        0      17      2

Assessment Question # 3
Direct Evidence: KRS 488 – Practicum Evaluation Form; Professional Dispositions Rubric

                                                 Fall 2014 (N = 18)      Spring 2015 (N = 40)    Summer 2015 (N = 21)
                                                 U      A       T        U      A       T        U      A       T
  %                                              5.56%  88.89%  5.56%    2.50%  92.50%  5.00%    0%     90.48%  9.52%
  ILO # 3
    3a. Continuous Learning and Personal Growth
      3a2                                        1      16      1        1      37      2        0      19      2
      3a3                                        1      16      1        1      37      2        0      19      2
    3b. Respect for People and Cultures, in Particular Hawaiian Culture
      3b6                                        1      16      1        1      37      2        0      19      2
      3b7                                        1      16      1        1      37      2        0      19      2
    3c. Stewardship of the Natural Environment
      3c6                                        1      16      1        1      37      2        0      19      2
    3d. Civic Participation in Their Communities
      3d5                                        1      16      1        1      37      2        0      19      2

U = Unacceptable; A = Acceptable; T = Target. Each N is the total number of students enrolled in the identified classes in which the direct evidence was administered and collected.

13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

Faculty in the program consistently use the results of direct evidence to improve individual classes and to enhance students’ learning experiences.

Given the total number of students, the number of students scoring at the “unacceptable” level is very low and within the expected distribution. Nevertheless, the direct evidence was examined carefully to determine whether these scores reflected student performance or the way content was delivered in the class.

In several classes (KRS 488, Fall and Spring), the students scoring at the unacceptable level were those who received an “Incomplete” course grade and did not finish the outstanding course requirements by the due date. When course requirements are not completed by the due date, an “Incomplete” grade automatically reverts to a failing grade.

Additional changes include the following:

1. Assignments have been revised to make rubrics clearer.

2. Examples of completed assignments are posted as a visual for students to follow along with the guidelines.

3. Course syllabi are reviewed twice.

4. A syllabus quiz is given to reinforce the guidelines.

5. Additional class time has been set aside for reviewing assignment guidelines.

6. Mandatory graded drafts of assignments have been added to courses to give students additional time to work on and apply the material.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Insights gained were that opportunities for additional review, along with examples, are helpful for students. Giving class time for students to work on assignments is also beneficial: students are more likely to work on assignments and ask the course instructor questions when class time is provided, and are less inclined to seek outside help during office hours.

16) If the program did not engage in assessment activities, please explain.

Not applicable; the program did engage in assessment activities.