Unit: Educational Psychology
Program: Educational Psychology (PhD)
Degree: Doctorate
Date: Mon Oct 18, 2010 - 5:55:40 am

1) Below are the program student learning outcomes submitted last year. Please add/delete/modify as needed.

1.    Educational Psychology graduate students are knowledgeable about learning and development, inquiry methods, and student assessment.

2.    Educational Psychology graduate students have inquiry skills to conduct scholarly research effectively.

3.    Educational Psychology graduate students present scholarly research effectively.

4.    Educational Psychology graduate students model the ethical treatment of research participants.

2) As of last year, your program's SLOs were published as follows. Please update as needed.

Department Website URL: http://www.coe.hawaii.edu/edep
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online: NA
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:

3) Below is the link to your program's curriculum map (if submitted in 2009). If it has changed or if we do not have your program's curriculum map, please upload it as a PDF.

Curriculum Map File(s) from 2009:

4) The percentage of courses in 2009 that had course SLOs explicitly stated on the syllabus, a website, or other publicly available document is indicated below. Please update as needed.


5) State the assessment question(s) and/or goals of the assessment activity. Include the SLOs that were targeted, if applicable.

The program faculty wanted to know whether candidates:

1. Were knowledgeable about learning and development, inquiry methods, and student assessment (SLO 1).

2. Had the inquiry skills to conduct scholarly research effectively (SLO 2).

3. Could present scholarly research effectively (SLO 3).

4. Modeled the ethical treatment of research participants (SLO 4).

6) State the type(s) of evidence gathered.

We collected candidates' dissertation proposals and final papers and documentation of whether their research had been approved by the UH Committee on Human Studies. 

7) Who interpreted or analyzed the evidence that was collected?

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)

8) How did they evaluate, analyze, or interpret the evidence?

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)

9) State how many persons submitted evidence that was evaluated.
If applicable, please include the sampling technique used.

Four candidates were assessed. Note that not all assessments were made for each candidate during the reporting period.

10) Summarize the actual results.

Proposal

Only one candidate's proposal was assessed. He met all expectations for the literature review and method section.

Human Subjects Review

The one candidate who was assessed received human subjects approval for his dissertation research.

Final Paper

Two candidates were assessed on their dissertation final papers. One candidate was rated higher than the other in half of the domains.

Final Presentations

Three candidates were assessed. The candidate whose final paper was rated lower was also rated lower on the final presentation. The other two candidates were rated high, except with respect to time limits: one candidate did not meet the standard because his presentation ran too long.

11) How did your program use the results? --or-- Explain planned use of results.
Please be specific.

The faculty found it difficult to draw implications from the results because so few candidates were assessed. They suggested that, in the future, assessments be aggregated across assessment periods.

12) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, program aspects and so on.


13) Other important information: