Unit: Institute for Teacher Education
Program: Elementary Education (BEd)
Degree: Bachelor's
Date: Thu Oct 08, 2015 - 12:53:01 pm

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

1. InTASC Standard 1. Learner Development The teacher understands how learners grow and develop, recognizing that patterns of learning and development vary individually within and across the cognitive, linguistic, social, emotional, and physical areas, and designs and implements developmentally appropriate and challenging learning experiences.

(1b. Specialized study in an academic field, 3b. Respect for people and cultures, in particular Hawaiian culture)

2. InTASC Standard 2. Learner Differences The teacher uses understanding of individual differences and diverse cultures and communities to ensure inclusive learning environments that enable each learner to meet high standards.

(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 3b. Respect for people and cultures, in particular Hawaiian culture)

3. InTASC Standard 3. Learning Environments The teacher works with others to create environments that support individual and collaborative learning, and that encourage positive social interaction, active engagement in learning, and self-motivation.

(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 2c. Communicate and report, 3b. Respect for people and cultures, in particular Hawaiian culture)

4. InTASC Standard 4. Content Knowledge The teacher understands the central concepts, tools of inquiry, and structures of the discipline(s) he or she teaches and creates learning experiences that make the discipline accessible and meaningful for learners to assure mastery of the content.

(1a. General education, 1c. Understand Hawaiian culture and history)

5. InTASC Standard 5. Application of Content The teacher understands how to connect concepts and use differing perspectives to engage learners in critical thinking, creativity, and collaborative problem solving related to authentic local and global issues.

(1b. Specialized study in an academic field, 2a. Think critically and creatively)

6. InTASC Standard 6. Assessment The teacher understands and uses multiple methods of assessment to engage learners in their own growth, to monitor learner progress, and to guide the teacher's and learner's decision making.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report)

7. InTASC Standard 7. Planning for Instruction The teacher plans instruction that supports every student in meeting rigorous learning goals by drawing upon knowledge of content areas, curriculum, cross-disciplinary skills, and pedagogy, as well as knowledge of learners and the community context.

(1b. Specialized study in an academic field)

8. InTASC Standard 8. Instructional Strategies The teacher understands and uses a variety of instructional strategies to encourage learners to develop deep understanding of content areas and their connections, and to build skills to apply knowledge in meaningful ways.

(1b. Specialized study in an academic field)

9. InTASC Standard 9. Professional Learning and Ethical Practice The teacher engages in ongoing professional learning and uses evidence to continually evaluate his/her practice, particularly the effects of his/her choices and actions on others (learners, families, other professionals, and the community), and adapts practice to meet the needs of each learner.

(2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report, 3a. Continuous learning and personal growth)

10. InTASC Standard 10. Leadership and Collaboration The teacher seeks appropriate leadership roles and opportunities to take responsibility for student learning, to collaborate with learners, families, colleagues, other school professionals, and community members to ensure learner growth, and to advance the profession.

(2c. Communicate and report, 3a. Continuous learning and personal growth, 3d. Civic participation)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://coe.hawaii.edu/students/ite-elementary/program-guidebooks/bed-elementary-education/elementary-education-program-12
Student Handbook. URL, if available online: https://coe.hawaii.edu/students/ite-elementary/program-guidebooks/bed-elementary-education/elementary-education-program-12
Information Sheet, Flyer, or Brochure URL, if available online: https://coe.hawaii.edu/students/ite-elementary/program-guidebooks/bed-elementary-education/elementary-education-program
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: Each course has specific learning outcomes based on the Interstate Teacher Assessment and Support Consortium (InTASC) standards for the teaching profession, which are included in course syllabi.
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2015:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

Yes
No (skip to question 16)

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, the effect of pre-requisites on learning achievement.
Investigate other pressing issue related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

Our program is focused on continuous improvement and we use assessment results for that purpose. We gathered data from multiple sources to identify strengths in our programs as well as areas in need of improvement. Our main questions follow:

1. Do our graduates have the requisite subject-specific content knowledge, as well as general and subject-specific teaching skills, that they need for beginning teaching?

We collected and evaluated direct evidence of student work/performance to determine SLO achievement, including course grades and evaluations of the students' capstone project. We also collected and analyzed Mentor Teachers' reports of SLO achievement via the Student Teaching evaluations and through surveys (indirect evidence).

2. To what degree are our graduates able to plan at levels required of successful beginning teachers?

We collected and evaluated direct evidence of student work/performance to determine SLO achievement, using evaluations of the students' capstone project. (Assessment Rubrics #1-5) We also used assessment results to make programmatic decisions, specifically in creating and requiring a new course in planning for, teaching, and assessing multi-lingual learners for the students in all B.Ed. Elementary Education programs.

3. To what degree are our graduates able to teach at levels required of successful beginning teachers?

We collected and evaluated direct evidence of student work/performance to determine SLO achievement, using evaluations of the students' capstone project. (Assessment Rubrics #6-10) We also used assessment results to make programmatic decisions, specifically in creating and requiring a new course in planning for, teaching, and assessing multi-lingual learners for the students in all B.Ed. Elementary Education programs.

4. To what degree are our graduates able to assess at levels required of successful beginning teachers?

We collected and evaluated direct evidence of student work/performance to determine SLO achievement, using evaluations of the students' capstone project. (Assessment Rubrics #11-15) We also used assessment results to make programmatic decisions, specifically in creating and requiring two new courses in classroom assessment for the students in the Exceptional Students in Elementary Education program. In addition, we created and require a new course in planning for, teaching, and assessing multi-lingual learners for the students in all B.Ed. Elementary Education programs.

5. To what degree do Mentor Teachers (clinical supervisors in field classrooms) perceive that our graduates have the knowledge, skills, and dispositions of effective beginning teachers?

We collected and analyzed indirect survey evidence of mentor teachers' perceptions of our students' knowledge, skills, and dispositions.

6. To what degree do COE field supervisors and school principals (our graduates' employers) perceive that our graduates have the knowledge, skills, and dispositions of effective beginning teachers?

We used assessment results to make programmatic decisions by establishing a means for tracking our graduates and following them into schools when hired as teachers in Hawaii. We will be collecting data on principals' perceptions of their success in the classroom, and each first-year teacher will have the support and guidance of a teacher education faculty member who will provide varying levels of support to assist them in making the transition into full-time teaching.

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1: We distributed an exit survey to graduating student teachers, but return rates were too low for meaningful analysis. We are working to improve the survey return rate.
Other 2:

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

We collected evidence for 115 graduating students.

We have data on Professional Dispositions for 422 students.

We did not sample; we collected data from all students.

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: College of Education Director of Assessment (Jessica Miranda)

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other: An external organization (Pearson, Inc.) evaluated a small sample of our students' capstone projects. The faculty used Pearson's scores to help calibrate internal and external scoring to improve interrater reliability.

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

1. Do our graduates have the requisite subject-specific content knowledge, as well as general and subject-specific teaching skills, that they need for beginning teaching?

We collected and evaluated direct evidence of student work/performance to determine SLO achievement, including course grades and evaluations of the students' capstone project. We also collected and analyzed Mentor Teachers' reports of SLO achievement via the Student Teaching evaluations and through surveys (indirect evidence). Another indicator of content knowledge is the students’ scores on the Praxis II test, which are listed below:

  • Educational Testing Service administers the Praxis II test. It measures the subject-specific content knowledge, as well as the general and subject-specific teaching skills, needed for beginning teaching.
  • Of the 147 elementary and secondary Bachelor of Education graduates from the College of Education in 2014, 97% passed the Praxis II exam.

 

2. To what degree are our graduates able to plan at levels required of successful beginning teachers?

We collected and evaluated direct evidence of student work/performance to determine SLO achievement, using evaluations of the students' capstone project. (Assessment Rubrics #1-5) We also used assessment results to make programmatic decisions, specifically in creating and requiring a new course in planning for, teaching, and assessing multi-lingual learners for the students in all B.Ed. Elementary Education programs.

Planning Rubric Criteria                                            % Approaching   % Proficient   % Exceeding
1. Planning for Literacy Learning                                        11              46             43
2. Planning to Support Varied Student Learning Needs                     29              36             36
3. Using Knowledge of Students to Inform Teaching and Learning           21              46             32
4. Identifying and Supporting Language Demands                           29              29             43
5. Planning Assessments to Monitor and Support Student Learning          11              43             46

Percentages may not sum to 100%, due to rounding.

  • Students were assessed on their ability to plan for instruction.
  • Planning for Literacy Learning emerged as a strength, with 89% of students meeting or exceeding proficiency.
  • Planning for Assessment was a relative weakness in prior years. Our students have made significant improvements since we began focusing on assessment in our methods courses; 89% of students met or exceeded proficiency.
  • Areas targeted for improvement include “Planning to Support Varied Student Learning Needs” and “Identifying and Supporting (academic) Language Demands.”

 

3. To what degree are our graduates able to teach at levels required of successful beginning teachers?

We collected and evaluated direct evidence of student work/performance to determine SLO achievement, using evaluations of the students' capstone project. (Assessment Rubrics #6-10) We also used assessment results to make programmatic decisions, specifically in creating and requiring a new course in planning for, teaching, and assessing multi-lingual learners for the students in all B.Ed. Elementary Education programs.

Instruction Rubric Criteria                  % Approaching   % Proficient   % Exceeding
6. Learning Environment                            4              71             25
7. Engaging Students in Learning                  18              64             18
8. Deepening Student Learning                     29              46             25
9. Subject Specific Pedagogy                       7              54             39
10. Analyzing Teaching Effectiveness              21              50             29

Percentages may not sum to 100%, due to rounding.

  • Our students do well in creating and maintaining a positive learning environment, with 96% meeting or exceeding proficiency.
  • Focal areas for improvement are engaging students and deepening their learning.

 

4. To what degree are our graduates able to assess at levels required of successful beginning teachers?

We collected and evaluated direct evidence of student work/performance to determine SLO achievement, using evaluations of the students' capstone project. (Assessment Rubrics #11-15) We also used assessment results to make programmatic decisions, specifically in creating and requiring two new courses in classroom assessment for the students in the Exceptional Students in Elementary Education program. In addition, we created and require a new course in planning for, teaching, and assessing multi-lingual learners for the students in all B.Ed. Elementary Education programs.

Assessment Rubric Criteria                                     % Approaching   % Proficient   % Exceeding
11. Analysis of Student Learning                                     21              61             18
12. Providing Feedback to Guide Further Learning                     39              36             25
13. Student Use of Feedback                                          71              25              4
14. Analyzing Students' Language Use and Literacy Learning           39              50             11
15. Using Assessment to Inform Instruction                           32              36             32

Percentages may not sum to 100%, due to rounding.

  • We are pleased to see that more of our students are using assessment FOR instruction (as opposed to focusing on assessment OF instruction). 68% of our students use assessment to inform their instruction.
  • Most (61%) of our students provide feedback to children on their achievement and growth; however, they need to take the next step of getting children to use that feedback to improve and deepen their learning.

5. To what degree do Mentor Teachers (clinical supervisors in field classrooms) perceive that our graduates have the knowledge, skills, and dispositions of effective beginning teachers?

  • Data were gathered through a web-based survey to all mentor teachers. In Fall 2014, 43 of the 84 mentor teachers responded, resulting in a response rate of 51%. In Spring 2015, 111 of the 150 mentor teachers responded, resulting in a response rate of 74%.
  • 89% of mentor teachers believed that, as a new teacher who soon will be responsible for his or her own classroom, the student teacher showed that he or she is knowledgeable in his or her field of study.
  • 87% of mentor teachers believed that, as a new teacher who soon will be responsible for his or her own classroom, the student teacher showed that he or she is effective in his or her teaching practices.
  • 93% of mentor teachers believed that, as a new teacher who soon will be responsible for his or her own classroom, the student teacher showed that he or she is caring in his or her professional dispositions.

6.  To what degree do COE field supervisors and school principals (our graduates' employers) perceive that our graduates have the knowledge, skills, and dispositions of effective beginning teachers?

We used assessment results to make programmatic decisions by establishing a means for tracking our graduates and following them into schools when hired as teachers in Hawaii. We will be collecting data on principals' perceptions of their success in the classroom, and each first-year teacher will have the support and guidance of a teacher education faculty member who will provide varying levels of support to assist them in making the transition into full-time teaching.

Principals responded to a survey question, "As a group, I would describe the College of Education graduates who work in my school as..."

  • Knowledgeable: 94%
  • Effective: 87%
  • Caring: 91%

COE field supervisors' ratings of students' professional dispositions revealed strengths and weaknesses. These results include ratings of students from entry into our programs through exiting at graduation. We use the ratings for early identification of students so we can provide constructive feedback and support while monitoring their progress.

  • 96% of students "Met Expectations" in "Professional and Ethical Conduct." This indicator includes responding to feedback in a solution-oriented manner, showing concern for children's well-being and safety, believing that all students can learn, and treating children and others fairly.
  • 92% of students "Met Expectations" in “Self Reflection.” Students who meet expectations are aware of and are insightful about their own psychological, emotional, and professional characteristics and can monitor how they affect others and adjust behavior.
  • 24% of students were rated as "Needing Improvement" in “Effective Work Habits.” These students often struggled with punctuality, organization, and meeting program requirements and deadlines. The significant increase in percentage from last year (an 18% increase) may be due to our heightened awareness of and focus on early identification of students who struggle in these areas. After identifying these students, cohort coordinators draft “Letters of Intent” and then a “Plan of Assistance” that describe students’ strengths, areas for improvement, a timeline, and support they can receive to remedy any weaknesses.
  • Effective communication has emerged as a concern for our students. 18% of our students were identified as needing more support in communicating effectively, which includes the ability to “communicate clearly, openly, and respectfully with all members of the College of Education and partner school communities,” to ask questions, and to “speak and write in a clear and grammatically correct manner.” The latter was found to be most prevalent. To address this concern, we used the writing samples students completed at the in-person intake interview to identify those who need support in writing. These students, and others identified at different points while in the program, were assigned a writing tutor who provided initial instruction through online modules during the summer preceding admission to our program and continued to help them improve their writing skills and strategies while they took courses in our B.Ed. program.

13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

The goal of the College of Education is to prepare educators who are knowledgeable, effective, and caring professionals. The Elementary Education (B.Ed.) program is committed to the assessment of our academic programs in order to continually improve student learning and to meet professional education standards. Program assessment is an ongoing process for us, designed to monitor and improve student learning at the program (e.g., degree) level. We used key program assessments of students’ knowledge, skills, and dispositions in several ways.

Both direct and indirect evidence show that the Elementary Education (B.Ed.) Program is successful in preparing students to be knowledgeable, effective, and caring teachers. We have used multiple sources of data including a content knowledge and pedagogy exam (Praxis II), evaluation of a field-based, authentic capstone project, and student, field supervisor, mentor teacher, and employer surveys.

Although our students received high ratings and did well on assessments, there were recurring findings suggesting that our graduates needed more assistance and guidance in teaching diverse learners.

As a result, Elementary Education (B.Ed.) has implemented several initiatives focused on improvement:

To ensure that all of our students meet or exceed proficiency in planning, teaching, and assessment, all methods course instructors will be required to focus on at least one of the teaching/learning components in depth in each course. This will give students many varied experiences, and instructors can monitor, formatively assess, and give constructive feedback to students before they are expected to demonstrate effective teaching during their final student teaching semester.

Programmatic decisions were made in creating and requiring a new course in planning for, teaching, and assessing multi-lingual learners for the students in all B.Ed. Elementary Education programs. The course content and strategies will help our students better meet the needs of diverse learners and will equip them with effective strategies to promote learning and growth.

Another data-driven initiative focuses on assessment. In order to strengthen our students’ assessment skills, faculty teaching the required Educational Psychology course bolstered the instruction on student assessment. More emphasis was placed on using data, and multiple sources of data, to plan for meaningful, relevant, and developmentally appropriate learning opportunities, as well as focusing on summative and high-stakes testing. Instructors in all methods courses taught both general assessment techniques as well as subject-specific ways to assess and evaluate children’s knowledge, skills, and dispositions.

Communication in writing emerged as an obstacle for many of our students. To address this need, we used the writing samples collected during the intake interview process to identify students who may struggle with writing, and 1-2 instructors were designated as department writing tutors. We started by offering an online summer remedial course in writing, and over the course of the semesters in our program, additional students were referred for writing tutor services (outside of courses). Students were also encouraged to seek help from tutors at the UHM writing center.

We established a means for tracking our graduates and following them into schools when hired as teachers in Hawaii. We will continue to collect data on principals' perceptions of their success in the classroom. We are also continuing the induction and mentoring program called “SONG” (Supporting Our New Graduates). Faculty will provide assistance and support to our graduates who are teaching in Hawaii DOE schools. We have hired a specialist dedicated to this endeavor. Other faculty participate as part of their workload; they will provide varying levels of support to assist these new teachers in making the transition into full-time teaching. Strategic support and resources will be provided.

As a department, we will continue to ensure that our stated learning outcomes are consistent with our expectations and instruction. We will provide opportunities for our students to review and demonstrate mastery, collect and evaluate evidence of their learning throughout our program, interpret the results, and use them in ways that will lead to higher outcomes and growth for our students.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

We are using the capstone project, the “edTPA” assessment, to help us understand the effectiveness of our instruction in preparing our students to plan, teach, and assess. The edTPA is a rigorous performance-based assessment of our students that aligns with our SLOs and InTASC standards. Many faculty members are piloting its use with our students, and we are in the process of redesigning some of our signature assessments to promote more effective teaching and learning. Through edTPA, we will be collecting specific evidence on students' ability to instruct. Segments of each student's teaching will be videotaped, and each student will compose a written commentary and self-analysis of his/her instruction. This process is geared for graduating students and beginning teachers. It is also aligned with National Board teacher certification (for advanced teachers) because the planning, teaching, assessment, reflection, and professional development components are similar.

16) If the program did not engage in assessment activities, please explain.

Not applicable.