Unit: Educational Foundations
Program: Educational Foundations (MEd)
Degree: Master's
Date: Wed Sep 16, 2015 - 3:11:02 pm

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

(Knowledge) Demonstrates analysis and critical thinking, and an understanding of the concepts, theories, issues, and/or complexities of the subject.

(Knowledge) Demonstrates a socio-cultural, historical, philosophical, or comparative understanding of the subject.

(Knowledge) Demonstrates the ability to synthesize information coherently.

(Skills) Writing or Presentation is organized, clear, and engaging. If applicable, uses correct grammar, spelling, punctuation, and proper citation.

(Disposition) Keeps an open mind to multiple perspectives and interpretations.

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL:
Student Handbook. URL, if available online: NA
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: NA
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2015:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

Yes
No (skip to question 16)

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, and the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

Collect and evaluate student performance to determine SLO achievement.

1. Knowledge. Written reflective responses on readings and discussions. Candidates demonstrate their analysis and critical thinking, and an understanding of key concepts, theories, and issues in the course.

2. Knowledge. Presentations. Candidates make presentations that demonstrate their understanding of issues related to the course content, whether historical, philosophical, socio-cultural, or comparative issues in education.

3. Knowledge. Essays. Candidates demonstrate their ability to synthesize information from the course.

4. Skills. Written Reflections and Essays. Candidates demonstrate their ability to write pieces that are organized, clear, and engaging, without errors in grammar, spelling, punctuation, or citation.

5. Disposition.  Discussions, Presentations, and Written Work. Candidates demonstrate their open-mindedness to various perspectives and interpretations.

Collect and analyze student self-reports of SLO achievement via surveys. Each semester, the Dean's office distributes program completer surveys to students in the last semester of their programs. Our data are published in reports aggregated by program and viewable on our COE Intranet. The data are also shown on the COE public website, "Measuring Our Success."

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

There were 85 candidates who took the courses. All of these students submitted the evidence used in the assessments. Student work was assessed using a rubric for each assessment.

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: College of Education Director of Assessment compiled survey results.

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

Re Collect/evaluate student work/performance to determine SLO achievement.

     Assessment for EDEF 630/683, Culturally Relevant Teaching/Social Inquiry. Faculty members were pleased with the results of student performance. Candidates did very well in their understanding of the subject, with 83% reaching target level. They also did very well in their professional dispositions, with 80% reaching target level. The percentages for the other standards were good: 77% for analysis and critical thinking, 74% for synthesis of information, and 66% for writing and presentation skills. The overall average for reaching target level across the 5 standards was good: 76%. Faculty were very pleased that 99% of all students reached either target or acceptable levels. Only one percent (2 students) were deemed unacceptable, in writing and presentation skills. The instructors who taught the course said that these two students struggled to write clearly; with guidance from the instructors, they did improve, but their writing still needed further improvement.

     Assessment for EDEF 669/671, Presentation on Readings/Education in a Country. Faculty were especially pleased with the students' ability to make effective presentations, with 93% reaching target level. The same was true of students' professional dispositions, with 93% reaching target level. Faculty were pleased that 70% of students reached target level in their analysis and critical thinking; faculty members were less satisfied with 59% reaching target level in their synthesis of information. On balance, faculty were pleased with the overall 79% reaching target level on the 4 standards for this course. They were very pleased that 100% reached target or acceptable levels.

Re Collect/analyze student self-reports of SLO achievement via surveys.

     Faculty members were very pleased that all students (100%) responding to the survey either agreed or strongly agreed with the following statements: become more knowledgeable in my field; grow as an educational professional; develop my knowledge of research methodology; develop my ability to apply research skills. For three other statements, some respondents were neutral: 20% for "develop important new skills in my field"; 60% for "develop my writing skills"; and 60% for "develop my presentation skills." Faculty members were pleased that no student disagreed or strongly disagreed with the above statements.

     Faculty members were pleased that 60% of students agreed or strongly agreed that they found courses intellectually challenging, with 40% neutral; that 80% found assignments relevant to their professional lives, with 20% neutral; that 80% were satisfied with the quality of instructors, with 20% neutral; that 100% strongly agreed or agreed that instructors were fair in their assessment and grading.

     Faculty saw two areas of concern. While 60% either strongly agreed or agreed that they were satisfied with the quality of academic advising, 20% disagreed (were not satisfied). And while 60% strongly agreed or agreed that faculty were supportive, 20% disagreed.

13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

Faculty members changed their course content and pedagogy. While faculty members were pleased with the assessment results, they discussed ways to increase the percentage of students reaching target level in all standards, particularly in understanding of the subject, analysis and critical thinking, and synthesis of information. As a result of these discussions, faculty members changed some of the reading materials and offered more structured class discussions to enable students to better understand concepts and theories. Faculty members also modified their syllabi by breaking assignments down into smaller steps. They further revised the assessment assignments/activities to provide clearer directions.

Faculty members also changed practices based on the student survey results. In response to student feedback on academic advising, the graduate chair contacted each MEd student in our program to ask whether they were satisfied with the quality of the academic advising they were receiving. If students were not satisfied, the graduate chair discussed the students' concerns with the particular faculty member, and if that did not improve the situation, guided students in following the department's procedure for changing advisors. To address student respondents who were not satisfied with the supportiveness of faculty, faculty members made it a point to ask the students in their classes, as well as their advisees, how faculty could better support them.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

NA

16) If the program did not engage in assessment activities, please explain.