Unit: Educational Administration
Program: Educational Admin (MEd)
Degree: Master's
Date: Fri Oct 02, 2020 - 8:07:29 am

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Educational leaders are knowledgeable about and understand organizational life in schools/colleges and the dynamics of institutional change processes by examining trends, traditions, theory and policies of institutions in order to improve educational practice which promotes the learning success of all students.

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest.)

2. Educational leaders understand, can articulate, and act within the moral/ethical, political, collaborative, strategic and caring dimensions of administrative roles within diverse cultural contexts.

(5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience., 6. Conduct research or projects as a responsible and ethical professional, including consideration of and respect for other cultural perspectives., 7. Interact professionally with others.)

3. Educational leaders demonstrate a well-developed analytic capacity that is informed by theory, research, and practice to solve organizational problems and generate policy.

(2. Demonstrate understanding of research methodology and techniques specific to one’s field of study., 3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study., 4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

4. Educational leaders can apply knowledge and skills to changing organizational contexts impacted by social, political, economic, cultural, and technological forces in order to foster the growth and development of the organization and its members.

(3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study.)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://coe.hawaii.edu/edea/
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2020:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.


5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

Yes, on some(1-50%) of the program SLOs
Yes, on most(51-99%) of the program SLOs
Yes, on all(100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?

No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)

8) Briefly explain the assessment activities that took place since November 2018.

In keeping with its AAQEP accreditation process, the faculty met (generally monthly at department meetings) to review all aspects of assessment, including SLOs, key assessments for the department, and assessment documents. In addition, faculty worked together to review copies of student work (with names and other identifiers redacted) to develop interrater reliability. Faculty entered student outcomes on key assessments into the College's Student Information System. Faculty also received survey data from recent completers, alumni, and employers and reviewed these data to consider programmatic changes that would improve students' learning experiences and program outcomes.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Completer/alumni surveys regarding their satisfaction with the program and how well it prepared them were completed by 20 graduates in 2018-2020.

Evidence was collected regarding student performance on course SLOs in the following courses between November 2018 and 2020:

EDEA 602  Research in Educational Administration - 78 students

EDEA 610  School Community Relations - 77 students

EDEA 630K  Public School Law, K-12 - 79 students

EDEA 657  Introduction to Higher Education - 76 students

EDEA 680  Curriculum Leadership - 76 students

EDEA 695  Capstone in Higher Education - 72 students

EDEA 699  Leadership Portfolio - 39 students



11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)

13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.

Data compiled from the courses entered in the College Student Information System indicated that over 95% of the students successfully achieved each SLO in the program.

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)

15) Please briefly describe how the program used its findings/results.


The results were used to discuss possible changes in the program, including resource allocation, course offerings (e.g., course rotations), and, in particular courses, the emphasis or focus of instruction to better direct teaching toward student success.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Since the department has two strands, one in K-12 leadership and the other in Higher Education, it was helpful for faculty from each strand to look at the student work and expectations in the other strand. It was also affirming to see that the vast majority of students are successful in achieving the SLOs and are generally very satisfied with the program as a whole.

17) If the program did not engage in assessment activities, please justify.