Unit: Curriculum Studies
Program: Early Childhood Education (MEd)
Degree: Master's
Date: Sat Nov 17, 2018 - 4:31:10 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Standard 1 - Child Development: MEd ECE graduates are knowledgeable about the developmental needs of young children from the prenatal period to eight years of age. As early childhood educators who care about children achieving their maximum potential, they use that knowledge to effectively create programs that support children's optimal development and employ strategies that engage and empower families in an ethical and culturally sensitive manner.

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

2. Standard 2 - The Field of Early Learning: MEd ECE graduates are knowledgeable about current issues and trends in early childhood care and education. As early childhood educators who care about the larger needs of the community, they use that knowledge to effectively provide ethical and culturally sensitive, place-based leadership and advocacy with regard to policy, decision-making in government agencies, and their own programs.

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

3. Standard 3 - Professionalism: MEd ECE graduates are knowledgeable about what it means to be a professional in the field of early childhood education. As early childhood educators who care about the field, they work effectively in collaboration with families and other professionals to provide services in an ethical, caring, and culturally sensitive manner. Candidates identify with and conduct themselves as members of the early childhood profession. They know and use ethical guidelines and other professional standards related to early childhood practice.

(5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience., 6. Conduct research or projects as a responsible and ethical professional, including consideration of and respect for other cultural perspectives., 7. Interact professionally with others.)

4. Standard 4 - Research: MEd ECE graduates are knowledgeable about the role of research in the field of early childhood education. As early childhood educators who care about using research-based strategies and methods, they effectively reflect on their current practice and initiate their own research projects. They critically analyze and apply current research in a manner that is sensitive to the children, families and communities they serve.

(2. Demonstrate understanding of research methodology and techniques specific to one’s field of study., 3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study., 5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience., 6. Conduct research or projects as a responsible and ethical professional, including consideration of and respect for other cultural perspectives.)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://coe.hawaii.edu/academics/curriculum-studies/med-ece
Student Handbook. URL, if available online: https://coe.hawaii.edu/academics/curriculum-studies/med-ece
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2018:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

No
Yes, on some(1-50%) of the program SLOs
Yes, on most(51-99%) of the program SLOs
Yes, on all(100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between June 1, 2015 and October 31, 2018?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period June 1, 2015 to October 31, 2018? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
No (skip to question 17)
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other: Collect survey and anecdotal data on interested applicants to determine barriers and supports to applying for the program. Identify supports needed for underrepresented student populations. Incorporate changing accountability requirements (institutional, accreditation, professional standards, licensing) into assessment methods.

8) Briefly explain the assessment activities that took place.

The MEd in Early Childhood Education operates on a 3-year cycle. New cohorts of students entered in Summer 2015 (Cohort V) and Summer 2018 (Cohort VI).

We regularly collect direct assessment data on student achievement as cohorts progress through coursework, via key assessments embedded in each of the mandatory courses over the 3 years of the program. Summative data are also collected in the form of program capstones (Plan A or Plan B) and oral defenses of capstone papers or projects.

Indirect assessment data are also collected in the form of annual student surveys assessing program components and student progress in light of program SLOs and capstone requirements; individual student advising conferences held each semester the student is in the program; informal conversations with course instructors and with Plan B advisors and readers; and collegewide student exit surveys.

In Summer 2015, we revisited the Plan B options in light of draft ILOs from Graduate Division. Up until this time, a majority of students (90% or more) completed a Plan B professional portfolio for their capstone. However, the increased emphasis on research competencies led us to revisit the program capstone. The cohort accepted in 2015 was the first group to pilot the transition to a Plan B Paper (study) or Project that grew out of designing a research proposal and piloting it in EDCS 632 Qualitative Research Methods. This course is taken in the Summer/Fall of the second year in the program.

Consequently, many of the assessment activities between 2015 and 2018 focused on evaluating the Plan B capstone in light of revisions to institutional ILOs and program standards, revisiting coursework, and strengthening the supports that students received as they used the capstone process to strengthen their knowledge and competency relative to institutional ILOs and program standards. The Program Director/Assessment Coordinator compiled data from student assessments and also consulted with Plan B advisors and instructors of key courses. The Faculty Steering Committee for the program evaluated the data as the 2015 cohort of students graduated from the program in 2017 and evaluated the impact of changes prior to admitting the 2018 cohort of students.

Concurrently, the program is in the process of responding to a growing demand for teachers with an early childhood license for our public PreK system. An "Add-a-Field" license in early childhood education was approved by the Hawaii Teacher Standards Board in Fall 2017. Existing assessments were identified and evaluated to assess their effectiveness in preparing students to meet licensure requirements when a student's educational goals included obtaining a license upon graduation (20% to 40% of students enrolled in the program have expressed interest in licensure).

Finally, the program also engaged in assessment activities between 2015 and 2018 to identify actions that could be taken to sustain enrollment. We were concerned about the viability of the program in light of the overall decline in graduate program enrollment, and because the MEd in Early Childhood Education has historically served a high number of students from vulnerable and underrepresented student populations. Students from these groups have diminished in number over time. Assessment activities included surveying and conducting informal interviews with current and prospective students and community stakeholders to identify barriers to enrollment and retention.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1: Faculty Steering Committee and Faculty Advisor feedback on procedures and process implemented to increase supports for students in strengthening research competencies.
Other 2: Potential applicant survey, anecdotal email and pre-application advising conference data about barriers/supports to pursuing a graduate degree in early childhood education.

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

A total of 36 students submitted direct evidence in the form of program assessments embedded in coursework or as part of the Plan B Capstone process. Between 2015 and 2017, data were collected from 1 student from MEd ECE Cohort III, 18 students from MEd ECE Cohort V, and 17 students from MEd ECE Cohort VI. No sampling techniques were used since we collected data on all students enrolled in the program.

Table 10.1. Students Submitting Program Assessments

Term/Semester(s)        | Number of Students | Courses / Formative & Summative Assessments                            | SLOs                | Program Standards
Summer 2015 (Cohort V)  | 18                 | FamR 491 – Literature Review                                           | 1, 5, 7             | 1
                        | 18                 | EDCS 667B – Community Mapping                                          | 1, 7                | 2, 3
Fall 2015-Spring 2016   | 18                 | Seminar – Annotated Bibliography / Capstone Progress Report            | 1, 7                | 1, 2, 4
Summer 2016             | 18                 | FamR 454 – Policy Brief                                                | 4, 7                | 2, 4
Fall 2016               | 18                 | EDCS 632 – Qualitative Research Proposal and Pilot Study               | 1, 2, 3, 4, 5, 6, 7 | 1, 2, 3, 4
                        |                    | Seminar – CITI Certificates                                            | 6                   | 4
Spring 2017             | 17                 | Seminar – Capstone Progress, Chapter 1, 3 Drafts                       | 1, 2, 3, 4, 5, 6    | 1, 2, 3, 4
                        |                    | IRB Applications                                                       | 6                   |
Summer 2017             | 16                 | EDCS 618 – Analysis of Ethical Dilemma & Professional Development Plan | 6, 7                |
Fall 2017               | 15                 | Plan A or Plan B Capstone and Orals (Summative Assessment)             | 1, 2, 3, 4, 5, 6, 7 | 1, 2, 3, 4
Summer 2018 (Cohort VI) | 16                 | FamR 491 – Literature Review                                           | 1, 5, 7             | 1
                        | 16                 | EDCS 667B – Community Mapping                                          | 1, 7                | 2, 3

Indirect data from student annual program evaluations and program-level completion surveys were collected for the years 2015, 2016, 2017, and 2018.

Table 10.2. Students Completing MEd ECE Annual Program Survey

Year | Number of Students
2015 | 16
2016 | 17
2017 | 18
2018 | 16

Table 10.3. Students Submitting College of Education Program Completer Survey

Year      | Number of Students
2015-2016 | 1
2016-2017 | 13

Data were also informally solicited from all faculty involved in the program. This took the form of post-teaching debriefs with instructors for mandatory and elective courses and informal debriefings with faculty advisors. Six faculty members responsible for teaching the mandatory courses were included in the debriefings. In addition, 8 faculty members who advised students for Plan A and Plan B capstones were consulted on the students' process and on areas where we could improve coursework, advising, and support for capstone development. This information was shared with the Faculty Steering Committee and used in decisions on which initiatives to maintain and which areas to modify.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: Assessment Director for MEd ECE Program

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results of the assessment activities checked in question 7. For example, report the percentage of students who achieved each SLO.

Direct assessment data were collected from a total of 36 students.

Many of the program assessments in the first two years (2015-2016) that Cohort V was progressing through the program served as formative assessments. They focused on addressing the draft SLOs introduced by Graduate Division. The research proposal developed and piloted in EDCS 632 in Fall 2016 was the first opportunity to assess student outcomes in light of all 7 program SLOs and served as a formative assessment halfway through the program.

In Fall 2017, the program capstone (a Plan A or Plan B Paper/Project and Orals) completed in the final semester of the program for Cohort V was a summative assessment in which students needed to demonstrate essential competency in all 7 SLOs for the program. 7 students demonstrated essential competencies and 8 students exceeded program expectations. 3 students in Cohort V did not meet capstone requirements: 2 students needed to pause their coursework and/or capstones due to medical conditions or job-related factors that hindered their progress (the program is in touch with these students, and each has an individualized plan to complete the program on an extended timeline), and 1 student did not complete the program and has since dropped from active status.

Table 13.1 Student Learning Outcomes Rubric Scores for Key Program Assessments

Term/Semester(s)        | Number of Students | Courses / Formative & Summative Assessments                              | *SLOs Addressed     | Did Not Meet Expectations (Incomplete) | Demonstrated Essential Competencies | Exceeds Program Expectations
Summer 2015 (Cohort V)  | 18                 | FamR 491 – Literature Review                                             | 1, 4, 7             | 2  | 12 | 4
Fall 2015-Spring 2016   | 17                 | Seminar – Annotated Bibliography / Capstone Progress Report              | 1, 7                | 2  | 11 | 4
Summer 2016             | 18                 | FamR 454 – Policy Brief                                                  | 4, 7                | 0  | 2  | 16
Fall 2016               | 18                 | EDCS 632 – Qualitative Research Proposal & Pilot Study                   | 1, 2, 3, 4, 5, 6, 7 | 0  | 2  | 16
                        |                    | Seminar – CITI Certificates                                              | 6                   | 0  | 18 | NA
Spring 2017             | 17                 | Seminar – Capstone Progress, Chapter 1, 2, 3, 4 Drafts (IRB if relevant) | 1, 2, 3, 4, 5, 6, 7 | 3  | 14 | 0
Summer 2017             | 16                 | EDCS 618 – Analysis of Ethical Dilemma                                   | 6, 7                | 1  | 5  | 11
                        |                    | Professional Development Plan                                            | 1, 2, 7             | 0  | 9  | 8
Fall 2017               | 15                 | Plan A or Plan B Capstone & Orals (Summative Assessment)                 | 1, 2, 3, 4, 5, 6, 7 | 3  | 7  | 8
Summer 2018 (Cohort VI) | 17                 | FamR 491 – Literature Review                                             | 1, 5, 7             | 1  | 9  | 7
                        | 16                 | EDCS 667B – Community Mapping                                            | 1, 7                | 0  | 10 | 6

*Data were not disaggregated by individual SLOs in assignment rubrics, although the rubrics include descriptors that address the appropriate program SLOs. Faculty advisors discussed student performance relative to individual program SLOs in course debriefs and throughout the advising process for Cohort V, who were enrolled in the program while revisions were being implemented to address new institutional, professional, and licensing standards. After the first group submitted the new capstones (Plan A & B), the Faculty Steering Committee evaluated the Plan A and B capstones in light of program SLOs.

Student survey data also provided indirect evidence of self-reported student outcomes as a result of the program. Two types of surveys are regularly administered. Although program evaluations are administered annually, this report includes only data from the MEd ECE Program Survey for the most recent graduating group (Fall 2017). This is likely the best indicator of program outcomes because it shows student perceptions of competence as they completed the program.

Table 13.2 Student Self-Reported Evaluation of Program Effectiveness in Developing Competency in Program Standards and Student Learning Outcomes (MEd ECE Program Evaluation Survey - Cohort V, Year 3, Fall 2017)

Program Standards / SLOs               | Not at All | Minimally | Adequately  | More Than Adequately | Exceptionally
MEd ECE Standard 1 / SLO 1, 4          |            |           | 5.88% (n=1) | 41.18% (n=7)         | 52.94% (n=9)
MEd ECE Standard 2 / SLO 1, 4          |            |           |             | 29.41% (n=5)         | 70.59% (n=12)
MEd ECE Standard 3 / SLO 7             |            |           | 5.88% (n=1) | 41.18% (n=7)         | 52.94% (n=9)
MEd ECE Standard 4 / SLO 2, 3, 4, 5, 6 |            |           |             | 23.53% (n=4)         | 76.47% (n=13)

A second source of student data is the College of Education Program Completer Survey, which is sent to all students graduating from the program in a given semester. Data from the most recent graduating group (Fall 2017) are also included because they provide complementary and corroborating evidence of student perceptions of competence as they exited the program.

Table 13.3 Student Self-Reported Evaluation of Program Effectiveness in Developing Competency in Program Standards and Student Learning Outcomes (College of Education Program Completer Survey - Fall 2017)

Statement                                                                       | Program Standards / SLOs             | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree     | Strongly Agree
The master’s program helped me to become more knowledgeable about my field.     | MEd ECE Standard 1 & 2 / SLO 1, 2, 4 |                   |          |                            | 31% (n=5) | 69% (n=11)
The master’s program helped me to develop important new skills in my field.     | MEd ECE Standard 1 & 2 / SLO 1, 4    |                   |          | 6% (n=1)                   | 31% (n=5) | 63% (n=10)
The master’s program helped me to grow as an educational professional.          | MEd ECE Standard 3 / SLO 5, 6, 7     |                   |          |                            | 31% (n=5) | 63% (n=10)
The master’s program helped me to develop my knowledge of research methodology. | MEd ECE Standard 4 / SLO 2           |                   |          |                            | 20% (n=3) | 80% (n=12)
The master’s program helped me to develop my ability to apply research skills.  | MEd ECE Standard 4 / SLO 3           |                   |          | 6% (n=1)                   | 19% (n=3) | 75% (n=12)
The master’s program helped me to develop my writing skills.                    | MEd ECE Standard 3, 4 / SLO 5        |                   | 6% (n=1) |                            | 13% (n=2) | 81% (n=13)

In general, direct assessments indicate that a majority of the students in the most recently graduated cohort met or exceeded program expectations for demonstrating knowledge and competency in the Program Standards and the institution's Student Learning Outcomes (SLOs). The degree to which they met or exceeded program expectations differed when comparing faculty and student ratings for different SLOs and program standards. In general, faculty were more critical in assessing students' research knowledge and competencies and their knowledge of the field of study. There was outlier data from one student who felt that the development of her writing and research skills could be further supported by the program.

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other: Evaluate and revise the interaction between cohort coordination, individual and group advising, sequencing of coursework, seminars, and scaffolding of Plan B Capstone across coursework to better support students in demonstrating increased rigor in research competencies.

15) Please briefly describe how the program used the results.

1. Alignment of Outcome Measures with Various Accountability Systems. The program revised program SLOs and key assessments to align with Graduate Division's revised ILOs and the professional standards used for accreditation of teacher preparation programs in the discipline. Assessment activities also concurrently evaluated coursework, key assessments, and other program components in light of requirements from the Hawaii Teacher Standards Board for students pursuing the "add-a-field" license. We cross-referenced the accountability expectations (Graduate Division ILOs, professional preparation program standards for early childhood education, licensing standards) and reorganized them into a matrix that can be used to address the different accountability systems. With the help of the MAO faculty and the College of Education Assessment Coordinator, we are in the process of identifying how to maintain our data collection methods (e.g., key assessments and surveys) while streamlining reporting specific to the various reporting systems.

2. Program Changes (Course Syllabi, Content, Program Assessments, Academic Supports). We identified the data and key assessments that could be submitted as evidence for different purposes (institutional, accreditation, licensure). Based on these guiding outcome measures, we:

  • Revised and refined course content and SLOs, the sequencing of courses, the program capstone (Plan B) Paper or Project, the student handbook, and the Laulima Resource Collection. We are also working with the Hamilton Library librarian for our field to create a guide for students of the program and to identify publications that can be added to the collection.
  • Instituted monthly online seminars using Zoom to provide student support and enhance the scaffolding of Plan B and licensure competencies over the life of the program of study. The seminars intentionally support Plan B "works in progress," teach research skills such as using a reference management system, and introduce students to research methodologies and topics of interest.
  • Created foundational coursework and field-based assessments to address "gaps" in practitioner knowledge and competencies needed by students pursuing licensure. This is a challenge across the field of early childhood education, not specific to this program, because educational pathways at the associate and baccalaureate levels are inconsistent nationwide.

3. Barriers to Enrollment and Retention. The program is concerned about equity in student representation and about factors that disproportionately impact particular demographic groups within the field of early childhood education. Experienced early childhood educators from vulnerable student populations were identified as potential students by undergraduate faculty and system leaders in their respective communities, yet the program has experienced a large drop in enrollment from these populations over time. Data collected helped us identify that the cost of a graduate degree was the greatest barrier confronting potential and continuing students. Other factors affecting student enrollment and retention include geographic or social isolation, multiple demands on time and resources, and access.

Concerted effort over the past few years has increased the financial assistance available to students and enhanced wrap-around advising to help students identify and successfully apply for funding that could lower the cost of pursuing a degree. Secondarily, other program changes were initiated to provide social and academic support for students from rural Oʻahu, the neighbor islands, and non-resident students, and for students from vulnerable groups (e.g., geographically isolated, first to college, underrepresented racial minorities, low-wage earners, multilingual learners) who express a need for more support navigating the social and cultural context of graduate school. One example of an enhanced support is the more robust use of technology to connect and build community among students in the program, in the form of online seminars that provide a professional learning community during the academic year.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Addressing Multiple Accountability Systems. The program is working to determine how best to negotiate the various assessment demands on the program in light of a diverse student population with varying educational and employment objectives, and how best to align with and answer to multiple accountability systems (institutional, accreditation, teacher licensure) and outcome measures (ILOs, program and professional standards, licensure requirements). One major insight concerns possible ways to consolidate data collection and storage in one place while differentiating data retrieval so that it is aligned with the outcome measures needed to address the criteria and focus required by different reports. The program is also identifying and streamlining what data will be submitted for specific reports, rather than trying to provide comprehensive reports that include all assessments for every report. This work is in process, and we hope to better disaggregate the data across emerging outcome measure systems as these become more stable. Professional standards for the field of early childhood education and accreditation standards for teacher preparation programs are in a period of transition, making it somewhat difficult to align and organize a data collection system. We are, however, making overall progress in conceptualizing how to address these multiple demands.

Sustaining Enrollment in a Program with a Vulnerable Student Population. We are pleased that enhanced indirect assessment methods initiated in 2012 yielded data that have enabled us to mitigate the decline in enrollment numbers. We were very concerned that the downward trend in student enrollment in graduate professional programs (39% since 2009 and 12% between 2017 and 2018) would disproportionately affect this program, which, like the overall field of early childhood education, is composed of a high proportion of students from vulnerable populations. Indirect program evaluation activities helped identify challenges and supports. Beginning in 2014, we directed energy toward the biggest barrier identified: the cost of attending school compared with the current and potential future wages students might anticipate as a result of completing a degree. Although more sustainable sources of funding need to be pursued, we have been able to access some funds that were instrumental in minimizing enrollment declines relative to overall institutional enrollment declines for masters-level professional preparation programs. Additional survey data are helping us better understand the "wrap-around" supports needed to retain and support students through program completion.

Table 16.1 Enrollment Decline in Overall Professional Masters Programs and the Master in Early Childhood Education

Master Level Professional Preparation Programs | Enrollment Decline, 2009 (Peak) to 2018 | Enrollment Decline, 2017 to 2018
Overall Masters Professional Programs          | 39%                                     | 12%
MEd in Early Childhood Education               | 30%                                     | 10%

17) If the program did not engage in assessment activities, please justify.

Not applicable.