Program: Learning Design and Technology (MEd)
Degree: Master's
Date: Sat Nov 14, 2020 - 8:13:10 pm
1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)
1. Students will be able to write a final research paper.
(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest.)
2. Students will be able to conduct an instructional design project.
(2. Demonstrate understanding of research methodology and techniques specific to one’s field of study. 3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study.)
3. Students will be able to write a research proposal/an idea paper.
(4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)
4. Students will be able to design and execute a culminating research project.
(5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience.)
5. Students will be able to present their research at an online international conference.
(7. Interact professionally with others.)
6. Students will be able to complete and submit an IRB application.
(6. Conduct research or projects as a responsible and ethical professional, including consideration of and respect for other cultural perspectives.)
7. Students will be able to work in collaborative teams.
(7. Interact professionally with others.)
2) Your program's SLOs are published as follows. Please update as needed.
Student Handbook. URL, if available online: https://docs.google.com/document/d/1DO872VRFZE2QOXBMoShO1qPV-8_AQNZllJzvdc1VQ5w/edit
Information Sheet, Flyer, or Brochure URL, if available online: Presentation and handouts during New Student Orientation.
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: NA
Other: Advising Documents
3) Please review, add, replace, or delete the existing curriculum map.
- File (11/15/2020)
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.
1-50%
51-80%
81-99%
100%
5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):
Yes, on some(1-50%) of the program SLOs
Yes, on most(51-99%) of the program SLOs
Yes, on all(100%) of the program SLOs
6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?
No (skip to question 17)
7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)
Other: Once a semester faculty meet to review every student on academic achievement and dispositions to make sure they are on track.
8) Briefly explain the assessment activities that took place since November 2018.
Assessment activities were similar to those conducted previously. All are mapped to the ILOs and SLOs and guided by the AECT standards, which have remained unchanged since they were last remapped (November 2018). Specific assessments include:
Most significantly, faculty used data collected for the Department's self-study to ensure alignment of ILO, SLO, and AECT standards.
Collect/evaluate student work/performance to determine SLO achievement
· The program created a Google spreadsheet to collect achievement data for the SLOs.
· Data were recorded by instructors for course assignments tied to all SLOs.
· Data were used for the Department's Self-Study report.
· Student reflections on most SLOs were collected through course assignments.
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
· The faculty teaching core courses continue to use assessment results to revise key assignments and content covered in courses.
Investigate curriculum coherence. This included investigating how well courses address the SLOs, course sequencing and adequacy, and the effect of prerequisites on learning achievement.
· Care was taken to ensure that the revised curriculum map (from November 2018) appropriately measured the SLOs and that there was no overlap among courses.
· Faculty met to review student progress based on collected data
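The aggregation step described above (instructor-recorded scores rolled up into cohort-level achievement data) can be sketched in a few lines of Python. This is a minimal illustration, not the program's actual spreadsheet workflow; the record tuples, assessment names, and threshold for flagging are all hypothetical, though the 0–2 rubric (0 = Concerns, 1 = Acceptable, 2 = Excellent) follows the scale reported later in this document.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records: (cohort, assessment, score) on the 0-2 rubric
# (0 = Concerns, 1 = Acceptable, 2 = Excellent).
records = [
    ("2018-19", "Idea Paper (LTEC 611)", 1.0),
    ("2018-19", "Idea Paper (LTEC 611)", 2.0),
    ("2018-19", "ID Project (LTEC 613)", 1.0),
    ("2019-20", "Idea Paper (LTEC 611)", 1.0),
    ("2019-20", "Idea Paper (LTEC 611)", 0.0),  # "Concerns" score
]

# Group individual scores by (cohort, assessment) cell.
by_cell = defaultdict(list)
for cohort, assessment, score in records:
    by_cell[(cohort, assessment)].append(score)

# Cohort means, as reported in the achievement table.
cohort_means = {cell: round(mean(scores), 2) for cell, scores in by_cell.items()}

# Flag individual "Concerns" scores (0) for follow-up advising.
concerns = [(c, a) for c, a, s in records if s == 0]
```

Each cell of the achievement table is simply the mean of the individual rubric scores for that cohort and assessment, while the per-student flags support the advising follow-up described in question 13.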
9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1: Semester review of student academic achievement and dispositions.
Other 2:
10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.
LTEC uses assignments linked to the ILOs from 4 core courses over the past 5 academic years. ILOs 2 and 3 are assessed in first-year courses, and ILOs 1, 4, 5, and 6 in final-year courses. ILO 7 is an assessment of student dispositions conducted at a faculty meeting at the end of each academic semester (but recorded only at the end of the academic year). The total number of students during that period is N=109, broken down by cohort: 2015-2016 = 26; 2016-2017 = 23; 2017-2018 = 22; 2018-2019 = 22; 2019-2020 = 17.
Year Entered | Master’s Paper Proposal (Content) LTEC 687 | ID Project (Research) LTEC 613 | Idea Paper (Learning Environments) LTEC 611 | Master’s Project & Paper (Research Analysis) LTEC 690 | Conference Presentation (Communication/Pedagogy) LTEC 690 | IRB Application & Approval (Professional Knowledge) LTEC 687 | Dispositions
2015-16 | 1.35 | 1.17 | 1.04 | 1.24 | 1.38 | 1.15 | 1.26
2016-17 | 1.29 | 1.21 | 1.03 | 1.36 | 1.44 | 1.35 | 0.90
2017-18 | 1.43 | 1.06 | 1.06 | 1.31 | 1.40 | 1.38 | 1.41
2018-19 | 1.86 | 1.50 | 1.17 | 1.64 | 1.80 | 1.80 | 1.89
2019-20 | NA | 1.17 | 1.11 | NA | NA | NA | 1.67
Mean | 1.36 | 1.24 | 1.07 | 1.33 | 1.45 | 1.36 | 1.43
Scale: 0 = Concerns, 1 = Acceptable, 2 = Excellent
11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:
12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other: Faculty discussion and disposition rubric
13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.
The chart provided previously also indicates the outcomes; it is provided here again. Please note that the data are aggregated. A review of the data indicated that nearly 100% of students achieved the SLOs in each category, with the following exceptions: (1) concern about one 2019-cohort student not meeting ILOs 2 and 3; (2) one 2019-cohort student not meeting ILO 3; (3) one 2017-cohort and one 2018-cohort student not meeting ILO 4. In all cases, the students were advised on the appropriate requirements. Similarly, students identified as having concerns with dispositions receive a letter and meet with the Department Chair for advising.
LTEC uses assignments linked to the ILOs from 4 core courses over the past 5 academic years. ILOs 2 and 3 are assessed in first-year courses, and ILOs 1, 4, 5, and 6 in final-year courses. ILO 7 is an assessment of student dispositions conducted at a faculty meeting at the end of each academic semester (but recorded only at the end of the academic year). The total number of students during that period is N=111, broken down by cohort: 2015-2016 = 24; 2016-2017 = 30; 2017-2018 = 17; 2018-2019 = 21; 2019-2020 = 19.
Year Entered | Master’s Paper Proposal (Content) LTEC 687 | ID Project (Research) LTEC 613 | Idea Paper (Learning Environments) LTEC 611 | Master’s Project & Paper (Research Analysis) LTEC 690 | Conference Presentation (Communication/Pedagogy) LTEC 690 | IRB Application & Approval (Professional Knowledge) LTEC 687 | Dispositions
2015-16 | 1.35 | 1.17 | 1.04 | 1.24 | 1.38 | 1.15 | 1.26
2016-17 | 1.29 | 1.21 | 1.03 | 1.36 | 1.44 | 1.35 | 0.90
2017-18 | 1.43 | 1.06 | 1.06 | 1.31 | 1.40 | 1.38 | 1.41
2018-19 | 1.86 | 1.50 | 1.17 | 1.64 | 1.80 | 1.80 | 1.89
2019-20 | NA | 1.17 | 1.11 | NA | NA | NA | 1.67
Mean | 1.36 | 1.24 | 1.07 | 1.33 | 1.45 | 1.36 | 1.43
Scale: 0 = Concerns, 1 = Acceptable, 2 = Excellent
14) What best describes how the program used the results? (Check all that apply.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other: Strategies for assisting students not meeting objectives.
15) Please briefly describe how the program used its findings/results.
These have been updated from the previous submission and are included in our Self-Study report.
While assessment results indicated that the vast majority of students in their final year were doing acceptable work on their master’s proposal (SLO 1), master’s project and paper (SLO 4), and IRB application (SLO 6), faculty believed there was room for improvement. Faculty teaching the final-year practicum courses (LTEC 687 and LTEC 690) met frequently to revise the content of both courses. They added instruction to assist students in writing drafts of their master’s paper, and they added course milestones integrated into a paper template so that weekly work on the milestones could be incorporated into the paper. The course schedules were adjusted to allow ample time for students to share milestones with critical friends, receive feedback, and then include their narratives in their master’s paper drafts. In addition, the IRB application in eProtocol appeared to challenge students, who struggled to find the appropriate pieces of their paper to fit into the eProtocol form. Therefore, questions and directions from the eProtocol form were highlighted in the paper template so that students could identify the sections of their paper associated with a particular eProtocol item. As the reported data show, scores on these assessments improved following implementation of the changes.
Faculty also observed from the data that there was room for improvement in the instructional design aspects of the final master’s project: students were not putting enough emphasis on the design process. Faculty revised the curriculum in first-year courses to include more practical design thinking in assignments. Faculty also noted that students could improve the literature review section of their master’s paper; discussion is ongoing about which first-year core course should include writing a literature review as an assignment. After the last program review, work was also done to revise all core assessments and align the ILOs with the new AECT standards, which serve as the program SLOs. Faculty also committed to providing feedback to students within a two-week timeframe.
16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.
We discovered that data were scattered across different locations and, particularly during the self-study, endeavored to consolidate them. We also realized that while there is room for specific improvement in data gathering, analysis, and student intervention, LTEC runs a quality program with strong student success and a robust alumni association. Students often mention the strong feeling of ohana in our programs.
17) If the program did not engage in assessment activities, please justify.
The program did engage in assessment activities.