Unit: Institute for Teacher Education
Program: Elementary Education (BEd)
Degree: Bachelor's
Date: Thu Nov 15, 2018 - 1:57:41 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. InTASC Standard 1. Learner Development The teacher understands how learners grow and develop, recognizing that patterns of learning and development vary individually within and across the cognitive, linguistic, social, emotional, and physical areas, and designs and implements developmentally appropriate and challenging learning experiences.

(1b. Specialized study in an academic field, 3b. Respect for people and cultures, in particular Hawaiian culture)

2. InTASC Standard 2. Learner Differences The teacher uses understanding of individual differences and diverse cultures and communities to ensure inclusive learning environments that enable each learner to meet high standards.

(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 3b. Respect for people and cultures, in particular Hawaiian culture)

3. InTASC Standard 3. Learning Environments The teacher works with others to create environments that support individual and collaborative learning, and that encourage positive social interaction, active engagement in learning, and self-motivation.

(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 2c. Communicate and report, 3b. Respect for people and cultures, in particular Hawaiian culture)

4. InTASC Standard 4. Content Knowledge The teacher understands the central concepts, tools of inquiry, and structures of the discipline(s) he or she teaches and creates learning experiences that make the discipline accessible and meaningful for learners to assure mastery of the content.

(1a. General education, 1c. Understand Hawaiian culture and history)

5. InTASC Standard 5. Application of Content The teacher understands how to connect concepts and use differing perspectives to engage learners in critical thinking, creativity, and collaborative problem solving related to authentic local and global issues.

(1b. Specialized study in an academic field, 2a. Think critically and creatively)

6. InTASC Standard 6. Assessment The teacher understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher's and learner's decision making.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report)

7. InTASC Standard 7. Planning for Instruction The teacher plans instruction that supports every student in meeting rigorous learning goals by drawing upon knowledge of content areas, curriculum, cross-disciplinary skills, and pedagogy, as well as knowledge of learners and the community context.

(1b. Specialized study in an academic field)

8. InTASC Standard 8. Instructional Strategies The teacher understands and uses a variety of instructional strategies to encourage learners to develop deep understanding of content areas and their connections, and to build skills to apply knowledge in meaningful ways.

(1b. Specialized study in an academic field)

9. InTASC Standard 9. Professional Learning and Ethical Practice The teacher engages in ongoing professional learning and uses evidence to continually evaluate his/her practice, particularly the effects of his/her choices and actions on others (learners, families, other professionals, and the community), and adapts practice to meet the needs of each learner.

(2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report, 3a. Continuous learning and personal growth)

10. InTASC Standard 10. Leadership and Collaboration The teacher seeks appropriate leadership roles and opportunities to take responsibility for student learning, to collaborate with learners, families, colleagues, other school professionals, and community members to ensure learner growth, and to advance the profession.

(2c. Communicate and report, 3a. Continuous learning and personal growth, 3d. Civic participation)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://coe.hawaii.edu/students/ite-elementary/program-guidebooks/bed-elementary-education
Student Handbook. URL, if available online: https://docs.google.com/document/d/12XfD_a6v3PYAosx_n6d2wbbLAw4kCmqnIDDjz1Qk1Wg/edit
Information Sheet, Flyer, or Brochure URL, if available online: https://coe.hawaii.edu/academics/institute-teacher-education/bed
UHM Catalog. Page Number: http://www.catalog.hawaii.edu/schoolscolleges/education/ite.htm
Course Syllabi. URL, if available online: https://coe.hawaii.edu/intranet/departments-centers/institute-teacher-education-ite (on College of Education intranet, requires login)
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2018:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

No
Yes, on some(1-50%) of the program SLOs
Yes, on most(51-99%) of the program SLOs
Yes, on all(100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between June 1, 2015 and October 31, 2018?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period June 1, 2015 to October 31, 2018? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
No (skip to question 17)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)
Other:

8) Briefly explain the assessment activities that took place.

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)

The COE licensure programs engaged in a collaborative effort to develop shared key assessments of student learning across all licensure programs. In 2015-16, a cross-program committee developed the four shared assessments. In 2016-17, the newly developed assessments were piloted by programs. In 2017-18, the shared assessments were fully implemented in all COE licensure programs.

Collect/evaluate student work/performance to determine SLO achievement:

Program faculty implemented the shared key assessments with all students in our program beginning in Fall 2017.

1. Assessment A – Planning Instruction: Candidates must demonstrate their ability to plan instruction for P-12 learners. Evidence for this assessment is a minimum of three lesson plans, which are scored on Domain 1 (Planning and Preparation) of the Charlotte Danielson Framework for Teaching (CDF). The assessment is completed prior to the student teaching semester.

2. Assessment B - Student Teaching Evaluation: Candidates demonstrate their competence as a teacher candidate in the four domains of the Charlotte Danielson Framework for Teaching (CDF): (1) Planning and Preparation, (2) the Classroom Environment, (3) Instruction, and (4) Professional Responsibilities. This assessment is completed during student teaching and is cumulative across the entire semester of work.

3. Assessment C – Effect on P-12 Learning: Candidates demonstrate their ability to plan, teach, and assess a unit of instruction/sequence of lessons. This assessment specifically addresses candidates' ability to plan and teach the unit, analyze student learning through assessment data, and reflect on their teaching practice to improve their instruction. The assessment is scored on designated components and elements of the Charlotte Danielson Framework for Teaching (CDF) and is completed during student teaching.

4. Assessment D - Professional Dispositions: Candidates must demonstrate professional dispositions, including professionalism, communication (verbal and nonverbal), collaboration, reflection, and diversity. This assessment is completed in all field and student teaching experiences.

Collect/analyze student self-reports of SLO achievement via surveys:

Each semester, program completer surveys are distributed by the Dean's Office to our candidates in their final semester of the program. Program faculty encourage their graduating students to complete the survey. These data are published in reports aggregated by program on the COE intranet and are also reported on the COE public website, "Measuring Our Success." Each fall (almost all of our students graduate in the spring), faculty review and discuss the results and implications of the survey data.

Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)

Based on the assessment results from the key assessments and student self-report data, the course ITE 320: Instructional and Assessment Methods for Multilingual Learners (3) was added as a requirement for all teacher candidates to help our students achieve SLO 2: Learner Differences, specifically 2(e): The teacher incorporates tools of language development into planning and instruction, including strategies for making content accessible to English language learners and for evaluating and supporting their development of English proficiency.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Direct Evidence based on Common Assessments

| Assessment | AY 15-16 | AY 16-17 | AY 17-18 |
| --- | --- | --- | --- |
| Assessment A: Planning | Not required, no data | Not required, no data | 57 |
| Assessment B: Student Teaching Evaluation | 113 (previous version of assessment completed) | 21 piloted new version; 38 completed previous version | 49 |
| Assessment C: Candidate Effect on P-12 Learning | Not required, no data | Not required, no data | 49 |
| Assessment D: Professional Dispositions | 342 (previous version of assessment completed) | 158 piloted new version; 95 completed previous version | 215 |

For the direct evidence, we engaged in a process of developing, piloting, and implementing the common assessments (Assessments A-D) across all COE licensure programs. During the 2015-2016 year, some of the assessments completed by students were the same as or similar to the common assessments. During the 2016-2017 year, some students piloted the new assessments while others completed the previous versions. During the 2017-2018 year, we engaged in full implementation, and faculty entered data for all four common assessments. By 2017-2018, students were required to complete all four common assessments; therefore, all available data for each assessment were included in the sample.

Indirect Evidence

| Survey | AY 15-16 | AY 16-17 | AY 17-18 |
| --- | --- | --- | --- |
| Program Completer Survey—Student Teachers | 71 | 67 | 52 |
| Mentor Teacher Survey | 85 | 74 | 59 |

Program completers (student teachers) are sent the survey; we request, but do not require, that they complete and submit it. The same is true for mentor teachers.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: Director of assessment (compiled survey results)

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results of the assessment activities checked in question 7. For example, report the percentage of students who achieved each SLO.

| Assessment | AY 15-16 | AY 16-17 | AY 17-18 |
| --- | --- | --- | --- |
| Assessment A: Planning | No data available | No data available | 98% |
| Assessment B: Student Teaching Evaluation | 99% on previous version of assessment | 100% on new version; 89% on previous version | 100% |
| Assessment C: Candidate Effect on P-12 Learning | No data available | No data available | 100% |
| Assessment D: Professional Dispositions | 97% on previous version of assessment | 97% on new version; 95% on previous version | 43.5% |

The results in the above table report the percentage of students who scored with proficiency overall for each of the assessment tasks.

| Survey | AY 15-16 | AY 16-17 | AY 17-18 |
| --- | --- | --- | --- |
| Program Completer Survey—Student Teachers | 98% | 100% | 99% |
| Mentor Teacher Survey | 100% | 100% | 96% |

The results in the above table report the percentage of students and mentor teachers in AY 15-16 who reported overall satisfaction with the College of Education, including those who reported they were very, mostly, or somewhat satisfied. Starting in AY 16-17, the data reflect the extent to which students and mentor teachers believed they or their teacher candidates were prepared as a result of their teacher education program. This percentage includes those who reported they or their teacher candidate were very prepared, mostly prepared, or somewhat prepared.

 

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used the results.

Because of the variability in the data (although most candidates met the SLOs with proficiency, there was wide variability within the range of scores considered proficient), we engaged in scoring reliability activities last semester and this semester. Over the course of three faculty meetings, all elementary education faculty discussed scoring issues, then scored and compared their scoring to come to agreement regarding the common assessment tasks (Assessments A-D). These discussions and scoring activities have helped move us toward a common understanding of the evaluation criteria and interpretation of the scoring rubrics.

 

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Based on feedback from the indirect data, while students and mentor teachers generally reported that they felt prepared by their teacher education programs, the teaching of Hawaiian language and culture stood out as an area where teacher candidates and mentor teachers felt less prepared. As a result, as a faculty, we are engaging in professional development to strengthen our knowledge of, and comfort in, integrating Hawaiian language and culture into our pedagogy. We have taken trips to community sites where Hawaiian knowledge is integrated and have engaged in informal language-learning sessions after our faculty meetings.

17) If the program did not engage in assessment activities, please justify.

N/A