Unit: Curriculum Studies
Program: Early Childhood Education (MEd)
Degree: Master's
Date: Mon Nov 23, 2020 - 5:25:14 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Standard 1 - Child Development: MEd ECE graduates are knowledgeable about the developmental needs of young children from the prenatal period to eight years of age. As early childhood educators who care about children achieving their maximum potential, they use that knowledge to effectively create programs that support children's optimal development and employ strategies that engage and empower families in an ethical and culturally sensitive manner.

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

2. Standard 2 - The Field of Early Learning: MEd ECE graduates are knowledgeable about current issues and trends in early childhood care and education. As early childhood educators who care about the larger needs of the community, they use that knowledge to effectively provide ethical and culturally sensitive, place-based leadership and advocacy with regard to policy, decision-making in government agencies, and their own programs.

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

3. Standard 3 - Professionalism: MEd ECE graduates are knowledgeable about what it means to be a professional in the field of early childhood education. As early childhood educators who care about the field, they work effectively in collaboration with families and other professionals to provide services in an ethical, caring, and culturally sensitive manner. Candidates identify with and conduct themselves as members of the early childhood profession. They know and use ethical guidelines and other professional standards related to early childhood practice.

(5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience., 6. Conduct research or projects as a responsible and ethical professional, including consideration of and respect for other cultural perspectives., 7. Interact professionally with others.)

4. Standard 4 - Research: MEd ECE graduates are knowledgeable about the role of research in the field of early childhood education. As early childhood educators who care about using research-based strategies and methods, they effectively reflect on their current practice and initiate their own research projects. They critically analyze and apply current research in a manner that is sensitive to the children, families and communities they serve.

(2. Demonstrate understanding of research methodology and techniques specific to one’s field of study., 3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study., 5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience., 6. Conduct research or projects as a responsible and ethical professional, including consideration of and respect for other cultural perspectives.)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://coe.hawaii.edu/academics/curriculum-studies/med-ece
Student Handbook. URL, if available online: https://coe.hawaii.edu/academics/curriculum-studies/med-ece
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2020:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

No
Yes, on some(1-50%) of the program SLOs
Yes, on most(51-99%) of the program SLOs
Yes, on all(100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)
Other: Collect survey and anecdotal data from interested applicants and current students to identify barriers and supports to applying for and persisting in the program. Identify supports needed for underrepresented student populations. Review key policy and institutional documents that inform accountability systems.

8) Briefly explain the assessment activities that took place since November 2018.

The MEd in Early Childhood Education program operates on a 3-year cycle. A new cohort of students entered in Summer 2018 (Cohort VI).

Between November 1, 2018, and October 31, 2020, direct assessment data was collected as students progressed through coursework, via key assessments embedded in each of the program’s mandatory courses and in some of the common electives for license-track students. The program also collected formative and summative data on student capstones (Plan A or Plan B) and oral presentations. The data collection period excludes the first intensive Summer session (2018) and the December 2020 summative data for the final semester of the current cohort (Fall 2020). It therefore mainly comprises formative data and is missing the beginning and end of a complete data cycle (Summer 2018 – Spring 2021). One student, a carryover from a previous cohort, completed her Plan B capstone, a summative assessment for the program, in Spring 2020. Students in the current cohort will complete the program during this academic year (Fall 2020 – Spring 2021), and data on their capstones is not yet available.

Indirect assessment data was also collected in the form of: annual student surveys to assess program components and progress in light of program standards, SLOs, and capstone requirements; student advising records; informal conversations with course instructors and Plan B advisors and readers; collegewide student exit surveys; faculty steering committee records; and feedback from the college's assessment coordinator on accreditation report data.

Between Fall 2018 and Fall 2020, assessment activities primarily focused on collecting data to assess the effectiveness of revised program content, key assessments and capstones (Plan A & B) in light of:

  • updated Graduate Division ILOs,
  • new national professional standards and competencies for the discipline,
  • reaccreditation of the unit under a new accreditation system, and
  • state requirements for newly approved license tracks.

Assessment data was also collected to evaluate progress in addressing barriers to enrollment and persistence for students from underrepresented demographic groups and vulnerable student populations.

 

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as an assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1: Potential applicant survey, emails and pre-application advising conference data about barriers/supports to pursuing a graduate degree in early childhood education
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Multiple participant groups contributed to the assessment data collected.

  • All students from Cohorts V and VI enrolled in the program between November 1, 2018 and October 31, 2020 (n=16).
  • A sample of recent graduates, comprising 10 of the 15 students who completed the program in Fall 2017. The sample includes all of the graduates who replied to the survey/interview request (n=10).
  • A sample of stakeholders and employers, comprising 10 of the 44 heads of agencies or organizations that hire or interact with graduates of the program. The sample (n=10) includes all of the stakeholders who replied to the survey request; these agencies and organizations were chosen because they have hired students from the program and because program faculty collaborate with them on workforce development efforts.
  • All of the faculty currently associated with the program: the faculty steering committee, instructors and advisors (n=14).
  • All students inquiring about the MEd ECE program between November 1, 2018 and October 31, 2020 (n=117).

Data was collected from a total of 167 persons, with some persons contributing to multiple data sources. This data is summarized in Table 1. Persons Contributing to Assessment Data.

 

Table 1. Persons Contributing to Assessment Data

| Assessment | Group | Number of Participants | SLO Addressed |
| --- | --- | --- | --- |
| Course-Embedded Key Assignments | Cohort VI Students | 15 | 1, 2, 3, 4, 5, 6, 7 |
| Capstone Work Product | Cohort V Student | 1 | 1, 2, 3, 4, 5, 6, 7 |
| CITI Certification and/or IRB Approval of Research | Cohort V Student | 1 | 4 |
| | Cohort VI Students | 15 | 2, 3, 4 |
| Oral Performance (Oral Defense) | Cohort V Student | 1 | 1, 2, 3, 4, 5, 6, 7 |
| Program Completer Survey & Interviews | Cohort V Students | 10 | 1, 2, 3, 4, 5, 6, 7 |
| Employer/Stakeholder Survey | Employers, State Agencies, Advocacy Groups, Private Organizations | 10 | 1, 4, 5, 6, 7 |
| Annual Program Evaluation Surveys | Cohort VI Students | 15 | Organized by program standards; the Dec 2020 survey was revised to disaggregate data by SLOs |
| Assessment-Related (e.g., Accreditation Report Data, Curriculum Map, Program Approvals) | Faculty / Accreditation Coordinators | 6 | 1, 2, 3, 4, 5, 6, 7 |
| Course Materials (Syllabi, Assignments, Assessment Rubrics) and Instructor Observations* | Faculty (Instructors) & Program Administrators | 7 | 1, 2, 3, 4, 5, 6, 7 |
| Faculty Steering Committee & Advisor Feedback* | Steering Committee | 3 | 1, 2, 3, 4, 5, 6, 7 |
| | Advisors | 8 | 1, 2, 3, 4, 5, 6, 7 |
| Applicant Feedback | Potential Applicants | 117 | NA |

*Informal feedback collected from faculty involved in the program took the form of post-teaching de-briefs with course instructors and informal de-briefings with faculty advisors. Six faculty members responsible for teaching courses were included in the de-briefings. In addition, 8 faculty members who advised students on Plan A and Plan B capstones were also consulted about students' progress and about areas where we could improve coursework, advising, and support for capstone development. The anecdotal evidence from faculty working with students was discussed by the Program Director with the Faculty Steering Committee, which consists of one member from each of the 3 departments involved in the interdisciplinary program.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.

Direct assessments 

All 16 of the students completed key assessments embedded in coursework or the capstone/orals. Data on students' ability to demonstrate competencies relative to the SLOs is summarized in Table 2. Student Performance on Direct Assessments.

All of the students enrolled in the program demonstrated essential competency or exceeded expectations on four of the seven SLOs (1, 4, 5, and 6). Students performed best on SLO 1 – Comprehensive Knowledge, with 38% demonstrating essential competency and 62% exceeding expectations, with a mean of 1.6. In comparison, on SLO 4 – Critically Analyze, Synthesize, & Utilize Information & Data, most of the students (60%) demonstrated essential competency and 40% exceeded expectations, with a mean of 1.3. Only one student completed a capstone; that student demonstrated essential competency for SLO 5 – Communicate & Disseminate Information Appropriately and SLO 6 – Responsible, Ethical, Professional Conduct of Research. All of the students completed CITI Certification relevant to their areas of study.

15 of the 16 students (94%) demonstrated essential competency or exceeded expectations on three of the seven SLOs (2, 3, and 7). The SLOs where students scored lowest focused on research. While a majority of the students (81%) met essential competencies for SLO 2 – Understanding of Research Methodology, only 13% exceeded expectations, with a mean of 1.1. Likewise, 12 of the 16 students (75%) met essential competency on SLO 3 – Research Methodology and Scholarly Inquiry Techniques, and only 3 students (19%) exceeded expectations for this SLO, with a mean of 1.1. By contrast, 6 of the 16 students (38%) exceeded expectations on SLO 7 – Interact Professionally, while 9 of 16 (56%) demonstrated essential competency in this area, with a mean of 1.3.
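The reported means appear to be simple averages of the 0/1/2 rubric scores (an assumption on our part; the report does not state the formula). As a worked check, the combined SLO 1 counts (0 students not meeting expectations, 6 demonstrating essential competency, 10 exceeding expectations, out of 16) reproduce the reported mean of 1.6:

```latex
\text{mean} = \frac{0 \cdot n_0 + 1 \cdot n_1 + 2 \cdot n_2}{N}
            = \frac{0(0) + 1(6) + 2(10)}{16} = \frac{26}{16} \approx 1.6
```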

Data for the current cohort (n=15) is formative. Only one student, who needed to extend her term beyond the previous cohort, completed a capstone and orals during this time period. Since the capstone and orals are a summative assessment for the program, we will have more complete direct assessment data of student abilities at exit after the current group finishes their program of study. There is not enough summative data in this data cycle to draw conclusions about student performance relative to the SLOs as students exit the program.

Table 2. Student Performance on Direct Assessments

| SLO | Courses / Formative & Summative Assessments | Sem | N | Did Not Meet Expectations (0) | Demonstrated Essential Competencies (1) | Exceeds Expectations (2) | Mean |
| --- | --- | --- | --- | --- | --- | --- | --- |
| SLO 1 Comprehensive Knowledge | International Public Policy Analysis (HDFS 454) | Sum 19 | 15 | 0 | 6 | 9 | 1.6 |
| | Plan A or Plan B Capstone & Orals (Summative Assessment) | Spr 20 | 1 | 0 | 0 | 1 | 2 |
| SLO 2 Understanding of Research Methodology | Qualitative Research Proposal & Pilot Study (EDCS 632) | Sum 19 | 15 | 1 | 12 | 2 | 1.1 |
| | Plan A or Plan B Capstone & Orals (Summative Assessment) | Spr 20 | 1 | 0 | 1 | 0 | 1 |
| SLO 3 Research Methodology and Scholarly Inquiry Techniques | Qualitative Research Proposal & Pilot Study (EDCS 632) | Sum 19 | 15 | 1 | 12 | 2 | 1.1 |
| | Plan A or Plan B Capstone & Orals (Summative Assessment) | Spr 20 | 1 | 0 | 0 | 1 | 2 |
| SLO 4 Critically Analyze, Synthesize, & Utilize Information & Data | Practical Application of Research and Theory (EDCS 656) | Sum 18 | 14 | 0 | 10 | 4 | 1.3 |
| | Practical Application of Research and Theory (EDCS 656) | Sum 19 | 13 | 0 | 9 | 4 | 1.3 |
| | International Public Policy Analysis (HDFS 454) | Sum 19 | 15 | 0 | 7 | 8 | 1.5 |
| | Plan A or Plan B Capstone & Orals (Summative Assessment) | Spr 20 | 1 | 0 | 0 | 1 | 2 |
| SLO 5 Communicate & Disseminate Information Appropriately | Plan A or Plan B Capstone & Orals (Summative Assessment) | Spr 20 | 1 | 0 | 1 | 0 | 1 |
| SLO 6 Responsible, Ethical, Professional Conduct of Research | CITI Certificates | Sum 19 | 15 | 0 | 15 | NA | 1 |
| | Plan A or Plan B Capstone & Orals (Summative Assessment) | Spr 20 | 1 | 0 | 1 | 0 | 1 |
| SLO 7 Interact Professionally | Professional Development Plan (EDCS 618) | Sum 20 | 15 | 1 | 8 | 6 | 1.3 |
| | Plan A or Plan B Capstone & Orals (Summative Assessment) | Spr 20 | 1 | 0 | 1 | 0 | 1 |
  

Indirect Assessment Data 

Multiple sources of indirect assessment data appear to concur with the finding that a majority of the students who graduate from the program demonstrate essential competency or exceed expectations on all of the program's SLOs. SLO 1 - Comprehensive Knowledge and SLO 7 - Interact Professionally were areas of strength. SLOs 2 & 3, which focus on research, were not mentioned by stakeholders/employers, but they are areas where program completers expressed increased confidence in their competencies.

10 of the 15 students who graduated in Fall 2017 participated in a five-question open-ended survey or an individual interview as part of the unit accreditation process in March 2020. All of these students reported feeling well prepared to engage in professional practice. Although the surveys/interviews did not specifically ask students to report competencies in light of the SLOs, students indicated they were well prepared in their early childhood and child development knowledge base and able to critically consume and apply professional literature (SLOs 1 and 4). 50% (n=5) of the program completers expressed confidence in their ability to access credible and current research studies or to design and conduct research (SLOs 2, 3, and 6). More than half commented that they were well prepared and ready to engage collaboratively with others in leadership or advocacy efforts (SLOs 5 and 7). Graduates identified a few areas of their knowledge base that could be strengthened, such as content focusing on young children with special needs or on infants and toddlers.

 

Stakeholder/Employer Surveys (Spring 2020)

Stakeholder/Employer Surveys, also conducted as part of the unit accreditation process in March 2020, yielded similarly favorable evaluations of the knowledge, competencies, and dispositions of program graduates. Graduates were described as possessing strong content-area knowledge, with child development and pedagogy as areas of strength (SLO 1). Employers also described graduates as having a strong foundation, “readiness through ethical behaviors” (SLO 6), and the ability to “engage in discussions, inquiry, and problem solving around issues and practices” (SLOs 5 and 7). The students were also described as “conscientious,” “reflective” (SLO 4), “able to engage in collaborative work” (SLO 7), and mindful about how to translate theory to practice (SLO 4). Stakeholders/employers suggested content and experiences they would like the program to strengthen; however, these comments appeared to address the program in general rather than the SLOs demonstrated by its graduates. For example, one respondent commented, “UH needs to do more providing for the needs of early childhood education in Hawaiʻi. There needs to be more community and workplace embedded efforts.” Stakeholder/employer feedback did not assess program completers' performance on the SLOs focusing on research competencies (2 and 3).

Annual Program Evaluation Surveys (Fall 2018, Fall 2019)

Students in the program participate in program surveys every Fall. The survey was revised after the Graduate Division introduced new ILOs in Fall 2018. The Fall 2019 survey evaluated each of the 8 major program standards against all 7 ILOs simultaneously, for a total of 56 possible responses. Upon review, this data proved too disaggregated to be meaningful. For Fall 2020, the survey has been modified to better isolate data addressing the institutional ILOs for advanced programs.

Other Sources of Indirect Assessment Data

Other sources of indirect data were gathered as part of program accreditation and license review processes. SLOs were explicitly stated in program literature such as course maps, program handbooks, syllabi, and assignments, indicating an intentional focus on addressing each SLO within courses, assignments, grading rubrics, and capstone activities. Areas of continued need, in light of feedback from instructors and advisors, include the SLOs focusing on research (SLOs 2 and 3) and critical thinking (SLO 5). This is consistent with the direct assessment data indicating a continued need to strengthen research capacity and critical analysis skills, and with general trends in the preparation of practitioners for the discipline within the broader state and national context.

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other: Advocacy (e.g. participating in drafting legislative bills for policy changes; providing data to justify creation of license tracks, revising of program formats)

15) Please briefly describe how the program used its findings/results.

The program has been anticipating and responding to multiple changes in accountability criteria during this data cycle: 1) adoption of new ILOs (2018), 2) reaccreditation of the unit coinciding with changes in the unit’s accreditation body (2018-2020), 3) revisions to disciplinary professional standards (2020), and 4) the addition of state teacher license requirements for new license tracks (2017-2020). In addition, post-COVID-19 revisions affect program efforts to maintain quality and to continue making progress in addressing the increased rigor of new ILOs and professional standards, while streamlining, consolidating, increasing enrollment, and pivoting the program to prepare students who are pursuing license tracks.

Therefore, the assessment data collected, changes in multiple accountability systems, and economic considerations are resulting in major program revisions. These include remapping course content and revising syllabi, key assessments, rubric criteria, and the data collected for the collegewide online assessment system. We are revising program literature and handbooks, as well as assessment-system data collection, to reflect these changes as we recruit for the next cohort, which will begin in Summer 2021. This information is being shared with faculty across the three departments who teach or advise students.

Students’ out-of-course experiences are intentionally considered in assessment activities. The program historically serves vulnerable student populations. Practitioners in the field are disproportionately impacted by economic hardship and by challenges managing work-life-school stressors or navigating higher education systems. For this reason, data gathered on barriers or challenges encountered by potential and current students, and on the effectiveness of program supports (financial, social, academic, professional), is vital. Data collected in the Annual Program Evaluation Survey, recruitment advising, and conversations with instructors, advisors, and students informed the enhancement of wrap-around resources and supports for students in the following areas:

  • financial assistance advising specific to the major,
  • peer and mentoring support,
  • resource collections to support development of research competencies, and
  • opportunities to engage in professional organizations and to develop relationships with potential employers.

Over 50% of the current cohort of students has been impacted by furloughs, lay-offs, or loss of a job due to COVID-19, and a majority have needed to modify or entirely change their capstone plans in the final year of the program as a result of the pandemic. Financial hardship continues to be the most frequently mentioned factor impacting student applications, enrollment, and persistence. Therefore, the program is also involved in advocacy work, collaborating with state agencies and philanthropic organizations to increase funding available for workforce development (e.g., loan forgiveness, scholarships, and tuition assistance) and to engage in research that informs workforce development policy proposals.

The current cohort has not yet completed the program. We will celebrate 3 students (20%) completing the program in December 2020; the other 12 (80%) will be extending their studies by a semester. All of the students (100%) continue to make steady progress toward their degrees. Summative evaluation data for this group is pending and is expected to inform program revisions that will support the next cohort of students in persisting in the midst of a pandemic.

 

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Traditional assessment measures such as achievement of SLOs, accreditation, and professional and license standards capture only part of the story for a program in a discipline that serves a largely non-traditional and vulnerable population. Assessment practices must also consider significant additional challenges so that they provide a fuller picture of students' resiliency and success, as well as the hardships they encounter. The program needs to collect assessment data in order to provide the supportive factors and resources needed to facilitate student persistence. Data can also help guide efforts to sustain and strengthen the program’s ability to prepare research-grounded advanced practitioners who can provide leadership in the discipline for the state.

 

Addressing multiple accountability systems and requirements simultaneously can be challenging for a program with a small infrastructure and limited resources. While there is increased demand on the part of policymakers and the public for early childhood programs and practitioners, there are substantial challenges to providing preparation programs for this student population.

 

17) If the program did not engage in assessment activities, please justify.

Not applicable. The program engaged in assessment activities.