Unit: Institute for Teacher Education
Program: Elementary Education (BEd)
Degree: Bachelor's
Date: Fri Nov 20, 2020 - 2:20:41 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. InTASC Standard 1. Learner Development The teacher understands how learners grow and develop, recognizing that patterns of learning and development vary individually within and across the cognitive, linguistic, social, emotional, and physical areas, and designs and implements developmentally appropriate and challenging learning experiences.

(1b. Specialized study in an academic field, 3b. Respect for people and cultures, in particular Hawaiian culture)

2. InTASC Standard 2. Learner Differences The teacher uses understanding of individual differences and diverse cultures and communities to ensure inclusive learning environments that enable each learner to meet high standards.

(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 3b. Respect for people and cultures, in particular Hawaiian culture)

3. InTASC Standard 3. Learning Environments The teacher works with others to create environments that support individual and collaborative learning, and that encourage positive social interaction, active engagement in learning, and self-motivation.

(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 2c. Communicate and report, 3b. Respect for people and cultures, in particular Hawaiian culture)

4. InTASC Standard 4. Content Knowledge The teacher understands the central concepts, tools of inquiry, and structures of the discipline(s) he or she teaches and creates learning experiences that make the discipline accessible and meaningful for learners to assure mastery of the content.

(1a. General education, 1c. Understand Hawaiian culture and history)

5. InTASC Standard 5. Application of Content The teacher understands how to connect concepts and use differing perspectives to engage learners in critical thinking, creativity, and collaborative problem solving related to authentic local and global issues.

(1b. Specialized study in an academic field, 2a. Think critically and creatively)

6. InTASC Standard 6. Assessment The teacher understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher's and learner's decision making.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report)

7. InTASC Standard 7. Planning for Instruction The teacher plans instruction that supports every student in meeting rigorous learning goals by drawing upon knowledge of content areas, curriculum, cross-disciplinary skills, and pedagogy, as well as knowledge of learners and the community context.

(1b. Specialized study in an academic field)

8. InTASC Standard 8. Instructional Strategies The teacher understands and uses a variety of instructional strategies to encourage learners to develop deep understanding of content areas and their connections, and to build skills to apply knowledge in meaningful ways.

(1b. Specialized study in an academic field)

9. InTASC Standard 9. Professional Learning and Ethical Practice The teacher engages in ongoing professional learning and uses evidence to continually evaluate his/her practice, particularly the effects of his/her choices and actions on others (learners, families, other professionals, and the community), and adapts practice to meet the needs of each learner.

(2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report, 3a. Continuous learning and personal growth)

10. InTASC Standard 10. Leadership and Collaboration The teacher seeks appropriate leadership roles and opportunities to take responsibility for student learning, to collaborate with learners, families, colleagues, other school professionals, and community members to ensure learner growth, and to advance the profession.

(2c. Communicate and report, 3a. Continuous learning and personal growth, 3d. Civic participation)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://coe.hawaii.edu/elem/
Student Handbook. URL, if available online: https://docs.google.com/document/d/12XfD_a6v3PYAosx_n6d2wbbLAw4kCmqnIDDjz1Qk1Wg/edit
Information Sheet, Flyer, or Brochure URL, if available online: https://coe.hawaii.edu/elem/programs/
UHM Catalog. Page Number: http://www.catalog.hawaii.edu/schoolscolleges/education/ite.htm
Course Syllabi. URL, if available online: https://drive.google.com/drive/folders/0B1Jw84HKwoUBUkpSd18wMW5JOFk

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2020:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.


5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

Yes, on some(1-50%) of the program SLOs
Yes, on most(51-99%) of the program SLOs
Yes, on all(100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?

No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issues related to student learning achievement for the program (explain in question 8)

8) Briefly explain the assessment activities that took place since November 2018.

We have engaged in a large number of assessment activities, as they are fundamental to our work as program faculty focused on continual program improvement. We engage in assessment activities as part of each monthly cohort coordinator and faculty meeting, and program accreditation activities are a consistent agenda item. Collaborative work related to continuous improvement included: 1) review of survey data (student teacher, mentor, and alumni), with discussion of implications and action steps; 2) review and refinement of implementation processes for common assessments and pedagogical supports, to discuss shared challenges and solutions as well as provide support for new faculty; 3) formal scoring reliability activities and discussions of validity for common assessments; 4) introductions to, and conversations about, innovative teacher education practices from those within and outside the department, to deepen our own knowledge and pedagogical practices and pass these on to students; 5) discussion of the AAQEP standards, state teacher performance standards, discipline-specific standards, and Danielson components, to deepen our shared understanding of these requirements and to examine how we are meeting the standards across courses; 6) review and refinement of program curriculum maps and core syllabi; 7) discussion of informal and formal feedback from community partners and alumni; and 8) discussion of the mission and vision of the program across and within program tracks, to nurture a shared vision. The goal of all of these endeavors is to ensure we are working collaboratively, based on assessment data, to prepare teacher candidates to fulfill their professional roles as effectively as possible. As part of accreditation reporting, the faculty compiled and analyzed data related to the program SLOs and accreditation standards, reviewed the final accreditation reports, and discussed the implications of the review process ahead of the accreditation visit this spring.
Additionally, faculty discussed the impact of COVID-19 on data collection: each cohort coordinator modified, adapted, or limited the assessment data they collected using the college-wide assessments, because candidates were not in the field or were engaged in virtual learning.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as the assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Direct Evidence based on Common Assessments

                                                     AY 17-18    AY 18-19    AY 19-20
Assessment A
Assessment B (Student Teaching Evaluation)
Assessment C (Candidate Effect on P-12 Learning)
Professional Dispositions
By 2017-2018, students were required to complete all four common assessments, and successful completion was required either to advance to the next part of the program or for program completion and recommendation for licensure; therefore, all available data for each assessment were included in the sample.

Indirect Evidence

                                                     AY 17-18    AY 18-19    AY 19-20
Program Completer Survey—Student Teachers                                    Data not yet available
Mentor Teacher Survey                                                        Data not yet available
We send the survey to the student teachers who are our program completers; we request, but do not require, that they complete and submit it. The same is true for mentor teachers. Due to the impacts of COVID-19 in the spring, survey data are not yet available. In addition, an alumni survey was completed in Fall 2020 (n = 62). Additional program completer feedback was collected by cohort coordinators using open-ended prompts in Spring 2020: of 222 Spring 2017-Fall 2019 program completers, 39 responded, a response rate of 18%.


11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Other: Assessment coordinator

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)

13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.

Grand Means Across Tracks for Each Aspect

Aspect 1a. Professional knowledge
  Assessment A: 1.72 (highest component: 1a. Demonstrates knowledge of content & pedagogy; lowest: 1c. Sets instructional outcomes)
  Assessment B: 2.14 (highest: 4a. Reflects on teaching; lowest: 3b. Uses questioning and discussion techniques)
  Assessment C: 1.90 (highest: 4a. Reflects on teaching; lowest: 1f. Designs student assessments)
  Assessment D: N/A

Aspect 1b. Learners and learning theory
  Assessment A: 1.74 (highest and lowest: 1a. Demonstrates knowledge of content & pedagogy and 1e. Designs coherent instruction)
  Assessment B: 2.25 (highest: 1b. Demonstrates knowledge of students; lowest: 1e. Designs coherent instruction)
  Assessment C: N/A
  Assessment D: N/A

Aspect 1c. Culturally responsive practice
  Assessment A: 1.73; Assessment B: 2.11; Assessment C: N/A; Assessment D: 2.34

Aspect 1d. Assessment of and for learning
  Assessment A: 1.63; Assessment B: 2.09; Assessment C: 1.89; Assessment D: N/A

Aspect 1e. Positive learning and work environments
  Assessment A: N/A; Assessment B: 2.21; Assessment C: N/A; Assessment D: N/A

Aspect 1f. Dispositions for professional practice
  Assessment A: N/A; Assessment B: 2.27; Assessment C: 1.94; Assessment D: 2.36

Grand mean: Assessment A: 1.705; Assessment B: 2.18; Assessment C: 1.91; Assessment D: 2.35

Aspect 2a. Understand local communities to foster relationships with families
  Assessment A: N/A; Assessment B: 1.9 (one component); Assessment C: N/A; Assessment D: 2.27 (one component)

Aspect 2b. Culturally responsive practices in diverse community contexts
  Assessment A: 1.73 (one component); Assessment B: 2.11 (highest: 1b. Demonstrates knowledge of students; lowest: 4c. Communicating with families); Assessment C: N/A; Assessment D: 2.34 (one component)

Aspect 2c.
  Assessment A: N/A; Assessment B: 2.21 (highest: 2a. Creates an environment of respect and rapport; lowest: 2d. Manages student behavior); Assessment C: N/A; Assessment D: N/A

Aspect 2d.
  Assessment A: N/A; Assessment B: N/A; Assessment C: N/A; Assessment D: N/A

Aspect 2e.
  Assessment B: 2.33 (one component); Assessment C: N/A; Assessment D: 2.39

Aspect 2f.
  Assessment A: N/A; Assessment B: 2.27 (highest: 4f. Shows professionalism; lowest: 4d. Participates in the professional community); Assessment C: N/A; Assessment D: 2.38

Grand mean: Assessment B: 2.16; Assessment D: 2.35


14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)

15) Please briefly describe how the program used its findings/results.

The teaching of the Hawaiian language and culture stood out as an area where teacher candidates and mentor teachers felt less prepared. As a result, we as a faculty engaged in professional development during AY 2018-19, and have continued since, to strengthen our knowledge of and comfort in integrating the Hawaiian language and culture into our pedagogy. We have taken trips to community sites where Hawaiian knowledge is integrated and have engaged in informal language learning sessions after our faculty meetings. We discuss how we are implementing and refining our practices in this area. While the results of these efforts are not yet evident in the data, this is an example of directly using data for program improvement.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Overall implications from the AAQEP study data:


  • Need to look at the interview process: how can applicants demonstrate potential competency?
  • The cohort structure is highly valuable, and the role of the cohort coordinator is pivotal.
  • Need to collaborate with OSAS on admissions, pre-requisite courses, and conditional admission.
  • Explore data/scores from admission interviews as part of the program review process.
  • What is the value of Assessment C to teacher candidates? Little program evidence is connected to this measure.
  • Need to examine student supports, especially for struggling students (POAs and off-track):
    o Supporting students on POAs is tremendously resource-intensive and falls on cohort coordinators.
    o Can we improve/streamline these processes so there is less burden on cohort coordinators and the process is efficient and effective?
  • Examine the AAQEP standards as a faculty and whether we are providing evidence in the ways we want to.
  • How can we connect with and extend our support for program graduates?
  • Areas of growth: Hawaiian language, questioning and discussion, assessment, setting instructional goals, and families.
  • Is there a measure we want to add or pilot related to CRP, especially if it replaces Assessment C?
  • Discussion would again be helpful around the criteria for what is acceptable for each measure, as well as the support provided to candidates for each assessment.


17) If the program did not engage in assessment activities, please justify.