Unit: Institute for Teacher Education
Program: Elementary Education (BEd)
Degree: Bachelor's
Date: Sun Oct 18, 2009 - 11:10:26 am

1) List your program's student learning outcomes (SLOs).

Our goal as a college is to employ and prepare educators who are knowledgeable, effective, and caring professionals who contribute to a just, diverse, and democratic society. Specifically, the EECE program learning outcomes are aligned with the standards of the Association for Childhood Education International (ACEI), upon which our accreditation with the National Council for Accreditation of Teacher Education (NCATE) is based. We received national recognition as an accredited program in 2007, continuing through 2012. The alignment of the ACEI standards with the Hawai‘i Teacher Standards follows. In addition, the specific learning outcomes of each content area course are based on the national and international standards for that area.

ACEI Standard: I. Development, Learning, and Motivation: Candidates know, understand, and use the major concepts, principles, theories, and research related to the development of children and youth to construct learning opportunities that support individual students’ development, acquisition of knowledge, and motivation.

Hawai‘i Teacher Standards (HTS)

1. Focuses on the Learner
2. Creates and Maintains a Safe and Positive Learning Environment
4. Fosters Effective Communication in the Learning Environment

ACEI Standard: II. Curriculum: Candidates demonstrate a high level of competence in their knowledge and application of the central concepts, tools of inquiry, and structures of content for students across the K-6 grades in the areas of English language arts, science, mathematics, social studies, the arts, health education, and physical education.

Hawai‘i Teacher Standards (HTS)

5. Demonstrates Knowledge of Content

ACEI Standard: III. Applying Knowledge for Instruction: Candidates plan and implement instruction based on knowledge of students, learning theory, subject matter, curricular goals, and community.

Hawai‘i Teacher Standards (HTS)

3. Adapts to the Learner
6. Designs and Provides Meaningful Learning Experiences
7. Uses Active Student Learning Strategies

ACEI Standard: IV. Assessment: Candidates know, understand, and use formal and informal assessment strategies to plan, evaluate, and strengthen instruction that will promote continuous intellectual, social, emotional, and physical development of each elementary student.

Hawai‘i Teacher Standards (HTS)

8. Uses Assessment Strategies

ACEI Standard: V. Professionalism: Candidates understand and apply practices and behaviors that are characteristic of developing career teachers. 

Hawai‘i Teacher Standards (HTS)

9. Demonstrates Professionalism
10. Fosters Parent and School Community Relationships

2) Where are your program's SLOs published?

Department Website URL: http://students.coe.hawaii.edu/Departments/Elementary_Education
Student Handbook. URL, if available online: http://students.coe.hawaii.edu/Departments/Elementary_Education
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:

3) Upload your program's current curriculum map(s) as a PDF.

Curriculum Map File(s) from 2009:

4) What percentage of courses have the course SLOs explicitly stated on the course syllabus, department website, or other publicly available document? (Check one)

0%
1-50%
51-80%
81-99%
100%

5) State the SLO(s) that was Assessed, Targeted, or Studied

All of our student learning outcomes are assessed in multiple forms over the course of the two-year EECE program. We have eight program assessments of student learning. Our focus this past year has been on evaluating and revising two of these eight assessments: Assessment #3 (student planning) and Assessment #5 (analysis of impact on K-6 student learning). These two assessments both address all of our student learning outcomes, but they are not the only ones that do so.

6) State the Assessment Question(s) and/or Goal(s) of Assessment Activity

1. To what degree are course instructors consistent in their ratings of student work for EECE Assessments #3 and #5?

2. What factors may explain inconsistencies in the ratings?

3. How can the results inform our assessment practices?

7) State the Type(s) of Evidence Gathered

Faculty collected samples of student submissions for Assessments 3 and 5 at varying levels of achievement.

8) State How the Evidence was Interpreted, Evaluated, or Analyzed

1. Faculty met to arrive at a consensus on interpreting the levels of student achievement and performance on the Assessment #3 and #5 rubrics.

2. Instructors independently rated a sample of student work.

3. Instructors met in small groups to compare ratings and discuss similarities and differences in the scoring.

4. The assessment coordinator ran an inter-rater reliability analysis using SPSS software.
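
The analysis itself was run in SPSS. For readers who want to reproduce this kind of inter-rater reliability (generalizability) analysis, the following is a minimal Python sketch for a fully crossed persons x raters design; the ratings matrix and every number in it are hypothetical, not the program's actual data.

import numpy as np

# Hypothetical ratings: rows = student work samples (persons), columns = raters.
ratings = np.array([
    [3, 4, 3],
    [2, 2, 3],
    [4, 4, 4],
    [1, 2, 2],
    [3, 3, 4],
    [2, 3, 2],
], dtype=float)

n_p, n_r = ratings.shape
grand = ratings.mean()

# Sums of squares for a two-way (persons x raters) design without replication.
ss_p = n_r * ((ratings.mean(axis=1) - grand) ** 2).sum()
ss_r = n_p * ((ratings.mean(axis=0) - grand) ** 2).sum()
ss_e = ((ratings - grand) ** 2).sum() - ss_p - ss_r

ms_p = ss_p / (n_p - 1)
ms_r = ss_r / (n_r - 1)
ms_e = ss_e / ((n_p - 1) * (n_r - 1))

# Variance-component estimates from the expected mean squares.
var_e = ms_e                           # residual (person x rater confounded with error)
var_r = max((ms_r - ms_e) / n_p, 0.0)  # rater main effect
var_p = max((ms_p - ms_e) / n_r, 0.0)  # person (true-score) variance

# Generalizability coefficient for relative decisions, averaging over n_r raters.
g_coef = var_p / (var_p + var_e / n_r)

print(f"person variance:   {var_p:.3f}")
print(f"rater variance:    {var_r:.3f}")
print(f"residual variance: {var_e:.3f}")
print(f"G coefficient:     {g_coef:.2f}")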

9) State How Many Pieces of Evidence Were Collected

1. For Assessment #3 (student planning), six student work samples were collected and rated by 18 raters.

2. For Assessment #5 (impact on student learning), four student work samples were collected and rated by 11 raters.

We had more work samples and raters for Assessment #3 because that assessment is attached to all content area courses, whereas Assessment #5 is attached only to the student teaching experience.

10) Summarize the Actual Results

Assessment 3:

Most of the variance in the scores is due to differences in the performances of the students being rated (person variance, 74% of the total), not to differences among raters or rubric components (~26%). The components and raters work well in differentiating students’ performance. The combined sources of error therefore contribute 24.9% of the variance in scores, which is relatively small compared with the person effect. The generalizability coefficient was 0.96, suggesting that this combination of raters and components for evaluating lesson performance yields highly reliable information.
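
For reference, the generalizability coefficient reported above is conventionally defined, assuming the standard fully crossed persons (p) x raters (r) x components (c) design (an assumption on our part; the report does not spell out the design), as:

% Generalizability coefficient for relative decisions in a p x r x c design;
% the sigma^2 terms are estimated variance components, and n_r and n_c are
% the numbers of raters and rubric components.
\[
E\rho^{2} \;=\; \frac{\sigma^{2}_{p}}
  {\sigma^{2}_{p} + \dfrac{\sigma^{2}_{pr}}{n_{r}} + \dfrac{\sigma^{2}_{pc}}{n_{c}} + \dfrac{\sigma^{2}_{prc,e}}{n_{r}\,n_{c}}}
\]

Roughly speaking, a coefficient of 0.96 means that about 96% of the observed score variance reflects true differences among students rather than rater or component effects.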

Assessment 5:

Variance Estimates

The sum of the variance components = 0.29. The percentage of total variance attributable to each source:

Persons:               0.000 / 0.29 =  0.0%
Components:            0.022 / 0.29 =  7.6%
Raters:                0.085 / 0.29 = 29.3%
Persons x Components:  0.035 / 0.29 = 12.1%
Persons x Raters:      0.086 / 0.29 = 29.7%
Components x Raters:   0.000 / 0.29 =  0.0%
Error:                 0.067 / 0.29 = 23.1%

Most of the variability is due to rater-related effects rather than to differences among the students' performances. This may indicate that raters had difficulty interpreting the rubric or the components of the performance.
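
For readers who want to check the arithmetic, the percentage breakdown above follows directly from the reported variance-component estimates. A short sketch (values copied from the table; the components actually sum to 0.295, which the report rounds to 0.29 and uses as the divisor):

# Reproduce the Assessment 5 percentage breakdown from the reported values.
components = {
    "Persons": 0.000,
    "Components": 0.022,
    "Raters": 0.085,
    "Persons x Components": 0.035,
    "Persons x Raters": 0.086,
    "Components x Raters": 0.000,
    "Error": 0.067,
}
total = 0.29  # rounded sum of the variance components, as reported
for name, value in components.items():
    print(f"{name:>22}: {value / total:6.1%}")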

11) Briefly Describe the Distribution and Discussion of Results

Results were shared at a faculty meeting. Discussion focused on how we could revise and improve the assessments themselves to better address the SLOs, and on ways to increase consistency in rating them.

12) Describe Conclusions and Discoveries

The results of this process provided evidence that faculty were more consistent in their ratings for Assessment 3 than for Assessment 5. Several possible reasons for this were discussed. Ultimately, the discussion led to the conclusion that these two assessments needed to be revised to better address the student learning outcomes.

13) Use of Results/Program Modifications: State How the Program Used the Results --or-- Explain Planned Use of Results

We used the information gathered from our inquiry to revise the Assessment 3 and 5 assignments and rubrics. We are piloting the revised formats this semester and will continue to evaluate their effectiveness and consistency in rating student work.

14) Reflect on the Assessment Process

The process helped the faculty identify similarities and differences in how they used the Assessment 3 and 5 rubrics to rate student work. It also spurred us to revise the assessment assignments and rubrics themselves.

Next time we will increase the number of work samples and raters for Assessment 5, and perhaps seek input from students and faculty outside our department.

15) Other Important Information

None.

16) FOR DISTANCE PROGRAMS ONLY: Explain how your program/department has adapted its assessment of student learning in the on-campus program to assess student learning in the distance education program.

The same assessments and assessment process are used with students in our statewide distance learning program.

17) FOR DISTANCE PROGRAMS ONLY: Summarize the actual student learning assessment results that compare the achievement of students in the on-campus program to students in the distance education program.

This is something that we have not done, but are beginning this semester.