Unit: Education (multiple departments)
Program: Professional Educational Practice (EdD)
Degree: Doctorate
Date: Thu Oct 08, 2015 - 10:31:46 pm

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

  • SLO 1: Leaders in professional educational practice work collaboratively to solve problems and implement plans of action.
  • SLO 2: Leaders in professional educational practice are able to apply research skills to bring about improvements in practice.
  • SLO 3: Leaders in professional educational practice can reflect critically and ethically on matters of educational importance.
  • SLO 4: Leaders in professional educational practice are able to take a broad, interdisciplinary perspective on a wide variety of educational issues.

 

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://coe.hawaii.edu/academics/educational-foundations/edd
Student Handbook. URL, if available online: https://coe.hawaii.edu/node/1528
Information Sheet, Flyer, or Brochure URL, if available online: https://coe.hawaii.edu/documents/1415
UHM Catalog. Page Number: http://www.catalog.hawaii.edu/schoolscolleges/education/grad.htm
Course Syllabi. URL, if available online: Course syllabi are available on Laulima and in the appendices of the Student Handbook.
Other: Laulima. See MAN EdD Cohort 2 at https://laulima.hawaii.edu/portal/site/2da80ada-3b85-41c4-b19f-7b6efb078f94
Other: Program FAQ at https://coe.hawaii.edu/academics/educational-foundations/edd

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2015:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

Yes
No (skip to question 16)

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

1.  Twenty-five of the 29 members of the first cohort (86.2%) successfully completed, orally presented (July 2014), and filed their capstone dissertations in practice.  They graduated in Summer 2014.  (Cohort I) Note: These results were previously included in the 2014 report.

2.  Twenty-one of the 25 graduates of Cohort I responded to the EdD Program Completion Survey.  (Cohort I)

3.  All members of our second cohort had their Group Consultancy Projects approved by IRB in Spring 2015.  (Cohort II)

4.  A program survey of Cohort II was administered to collect student perceptions of our class environment, the cohort's summer session experiences, and the program experience up to that point (Year 1 of 3), along with recommendations for improving the summer experiences and the program overall.

The survey was available from 7/14/2015 to 8/12/2015.

Twenty-two of the 26 members of Cohort II responded.  (Cohort II)

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Dissertation Used for Program-Level Assessment in Addition to Individual Student Evaluation (Cohort I, Summer 2014 and Spring 2015)

Twenty-five out of the 29 members submitted their dissertations in practice.  The sample includes all Cohort I completers.

 

EdD Program Completion Survey, 2014 (Cohort I, Fall 2014 and Spring 2015)

Twenty-one of the 24 completers (at that time) responded to an online survey.  All completers were sent the electronic survey.  The survey was available from 11/5/2014 to 2/6/2015.
 

IRB Approval of Research (Cohort II, Spring 2015)

By Spring 2015, all twenty-six students of Cohort II had received IRB approval for their Group Consultancy Projects, one of the program's two major projects reflecting its signature pedagogy.  Signature pedagogy, which involves inquiries centered on problems of practice, describes the fundamental ways that practitioners are educated to improve their professional practice.  IRB applications were submitted by group, according to each group's project, and all projects received approval.

 

Student Surveys That Contain Self-Reports of SLO Achievement  (Cohort II, Summer 2015)

Twenty-two of the twenty-six students responded to an online survey.  All students from Cohort II were sent the electronic survey.  The survey was available from 7/14/2015 to 8/12/2015.
 
 

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: Program Director / Graduate Chair

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

Summer 2015 SLO Survey Items (counts of respondents in parentheses)

SLO 1: My consultancy group used the Group Consultancy Work Days to make progress on our project.
    Strongly Disagree: 0.00% (0)   Disagree: 0.00% (0)   Neutral: 4.76% (1)   Agree: 38.10% (8)   Strongly Agree: 57.14% (12)
    Total: 21   Weighted Average: 4.52

SLO 2: The summer curriculum provided me with opportunities to develop as a scholar.
    Strongly Disagree: 0.00% (0)   Disagree: 0.00% (0)   Neutral: 5.00% (1)   Agree: 20.00% (4)   Strongly Agree: 75.00% (15)
    Total: 20   Weighted Average: 4.70

SLO 3: The summer curriculum provided me with opportunities to reflect critically and ethically on matters of educational importance.
    Strongly Disagree: 0.00% (0)   Disagree: 5.00% (1)   Neutral: 5.00% (1)   Agree: 20.00% (4)   Strongly Agree: 70.00% (14)
    Total: 20   Weighted Average: 4.55

SLO 4: This summer, I was able to explore a variety of perspectives on issues concerning education.
    Strongly Disagree: 0.00% (0)   Disagree: 0.00% (0)   Neutral: 0.00% (0)   Agree: 45.00% (9)   Strongly Agree: 55.00% (11)
    Total: 20   Weighted Average: 4.55


13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

The EdD Director provided the results to our faculty advisors and mentors (our Council of Advisors), which includes select representatives from our community educational partners.  We then discussed the results in light of adjusting both our immediate and long-term curriculum plans for the current cohort.  Specifically, we used the data to inform our decision to reorder the flow of activity in the current semester's curriculum, and we used data from both Cohorts I and II to restructure our summer session curriculum.  We also reordered the sequence of courses for Cohort II based on data collected from Cohort I.

We also used the results to inform planning for the next EdD cohort, which will begin in Summer 2017.  For example, we recently submitted the appropriate UHM-2 form to request a program change: based on our data, we determined that our EDUC 710: Group Consultancy Project course needs to be repeatable one additional time.  Here is a summary of our proposed change:

"The Group Consultancy Project is one of two major projects required by the program.  The change being requested is an increase in the repeat limit from one time to two times.  The change is being requested because satisfactory completion of Group Consultancy Projects may require more time than currently available under the existing repeat limit."

As a program focused on the improvement of professional educational practice, we are vigilant about and committed to using the program data we collect for our own program improvement.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Having one of our students win the Dissertation in Practice Award from our international consortium of more than 80 institutions provides great validation of the quality of our students, our curriculum, our program processes, and the efforts of our faculty and mentors.  The 2015 Dissertation in Practice competition was the first in which any of our program's EdD graduates were eligible to enter.  I expect that our future graduates will continue to produce dissertations in practice at or near the heights achieved by our first cohort.

Also, the Doctor of Education (EdD) Day proclamation that honored our program and its graduates has provided valuable recognition from our state's executive leaders for the particular contributions of our EdD program.  This public validation of our program is another great achievement that occurred in the past year.

16) If the program did not engage in assessment activities, please explain.

Not applicable.