Unit: Learning Design and Technology
Program: Learning Design and Technology (PhD)
Degree: Doctorate
Date: Thu Nov 15, 2018 - 11:34:38 am

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Students can demonstrate theoretical and conceptual knowledge in the broad issues of learning design and technology. (Knowledge and Understanding ILO 1.1)

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest.)

2. Students demonstrate knowledge of the various research approaches used in the learning design and technology field and are able to develop a research proposal using one or more of those methods. (Knowledge and Understanding ILO 1.2)

(1. Demonstrate comprehensive knowledge in one or more general subject areas related to, but not confined to, a specific area of interest; 2. Demonstrate understanding of research methodology and techniques specific to one's field of study.)

3. Students can apply their knowledge of the field and critical thinking to a design project and can present possible solutions to problems relevant to learning design and technology. (Intellectual and Applied Skills, ILO 2.1)

(4. Critically analyze, synthesize, and utilize information and data related to one’s field of study.)

4. Students are able to integrate appropriate tools, concepts and principles in learning design and technology to collect, analyze and synthesize both qualitative and quantitative data related to the field. (Intellectual and Applied Skills, ILO 2.2)

(3. Apply research methodology and/or scholarly inquiry techniques specific to one’s field of study.)

5. Students are able to communicate information relevant to the field both orally and in written form using appropriate tools and in a manner consistent with accepted professional and institutional guidelines and procedures. (Communication Skills, ILO 3.1)

(2. Demonstrate understanding of research methodology and techniques specific to one’s field of study.)

6. Students are able to design and conduct research in the field in a manner that is responsible and ethical and respects the cultural perspectives of others. (Professional Responsibility, ILO 4.1)

(6. Conduct research or projects as a responsible and ethical professional, including consideration of and respect for other cultural perspectives.)

7. Students demonstrate the personal and professional characteristics expected of those working in the field of learning design and technology. (Professional Responsibility, ILO 4.2)

(5. Proficiently communicate and disseminate information in a manner relevant to the field and intended audience; 7. Interact professionally with others.)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL:
Student Handbook. URL, if available online: https://docs.google.com/document/d/18H8pC8GY1i1LARaeaPu2uQB4KRAXka0JJuM7ZFWc7m4/edit
Information Sheet, Flyer, or Brochure URL, if available online: Part of orientation presentation for all new students
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: As appropriate for each course
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2018:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.") (check one):

No
Yes, on some (1-50%) of the program SLOs
Yes, on most (51-99%) of the program SLOs
Yes, on all (100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between June 1, 2015 and October 31, 2018?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period June 1, 2015 to October 31, 2018? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
No (skip to question 17)
Investigate other pressing issues related to student learning achievement for the program (explain in question 8)
Other:

8) Briefly explain the assessment activities that took place.

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)

  • The program revised its SLOs and curriculum map.

  • The program determined signature assessments (both formative and summative) to assess SLO achievement.

Collect/evaluate student work/performance to determine SLO achievement

  • Data were collected and entered into a Google document in all but one of the core courses.

  • Faculty on dissertation committees completed a rubric for proposal and dissertation defenses.

Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)

  • The faculty teaching the research course sequence met and revised the key assignments and content covered in each course.

  • The faculty developed one advanced methods course to address the needs of students interested in indigenous research.

  • The faculty began development of a design-based research (DBR) course to address the needs of students pursuing DBR in both projects and dissertations.

  • Faculty reviewed data from assessments at the annual retreat and recommended additional emphasis on certain aspects of the core courses (e.g., more emphasis on APA style in earlier courses and more practice with data analysis in methods courses).

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

All students must complete key assignments as well as a comprehensive exam, a proposal defense, and a dissertation defense. All students are also reviewed by the faculty annually for professional dispositions. The number of students submitting evidence varies by where they are in the program. For example, 63 student work samples were collected for one key assessment that occurs at the beginning of the program, but only 8 dissertation review sheets were collected when students defended their final dissertations. Missing data from some of the key courses reflect the retirement or resignation of two faculty members who did not provide data.

The LTEC program operates as a cohort program. The following cohorts are included in this assessment report.

  • 2015 cohort (including all students who transferred from EDUC to LTEC in 2015) = 53 students. 2016 cohort = 15 students. 2017 cohort = 6 students. 2018 cohort = 6 students.
  • 47 of 68 student conference proposal key assignments (LTEC 750) were scored and recorded from 2015 and 2016 cohorts.
  • 63 of 74 student qualitative analysis key assignments (LTEC 667) were scored and recorded from the 2015, 2016, and 2017 cohorts.
  • 63 of 68 student quantitative analysis key assignments (LTEC 668) were scored and recorded from the 2015 and 2016 cohorts.
  • 53 of 68 student design project key assignments (LTEC 701) were scored and recorded from the 2015 and 2016 cohorts.
  • 59 of 68 student literature review key assignments (LTEC 750) were scored and recorded from the 2015 and 2016 cohorts.
  • 62 of 68 student CITI scores (LTEC 760) were recorded from the 2015 and 2016 cohorts.
  • 73 of 74 students from the 2015, 2016, and 2017 cohorts received disposition scores for their first year in the program, and 62 students from the 2015 and 2016 cohorts have Year 2 scores.
  • 22 of 68 students from the 2015 and 2016 cohorts had scores recorded for passing their comprehensive examinations.
  • 12 of the 53 students in the 2015 cohort have completed their proposals and have scores.
  • 8 of the 2015 cohort members completed their dissertations and have rubric scores recorded.
  • All faculty submitted syllabi for review.
  • All 4 doctoral students responded to the Alumni Survey sent to 2016-17 graduates.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results of the assessment activities checked in question 7. For example, report the percentage of students who achieved each SLO.

Rubrics with six levels were developed for most of the key assessments. In general, 0 indicated that the knowledge or skill we were seeking was absent; the remaining levels were 1 (weak), 2 (needs some improvement), 3 (satisfactory), 4 (good), and 5 (excellent).

Formative assessment data showed the following:

  • Scores on the literature review assessment (LTEC 750 - SLO 1) showed 83% meeting expectations (score of 3, 4, or 5), with 50% scoring 4 or 5.
  • Scores on the conference proposal and presentation (LTEC 750 - SLO 5) showed 96% meeting expectations (scores of 3, 4, or 5), with 70% achieving a score of 5.
  • Scores on the ethics knowledge (CITI) assessment (LTEC 760 - SLO 6) showed 100% passing.
  • On the design project assessment (LTEC 701 - SLO 3), 100% had satisfactory scores (3, 4, or 5), but only 2% achieved a score of 5.
  • On the qualitative data analysis assessment (LTEC 667 - SLO 4), 91% received satisfactory scores (3, 4, or 5), but 45% scored only satisfactory (3) and just 18% excellent (5).
  • On the quantitative data analysis assessment (LTEC 668 - SLO 4), 95% received adequate scores (3, 4, or 5), with over half (57%) receiving a score of 4.

Summative assessment data showed the following:

  • Scores on the comprehensive examination showed 100% scoring 3, 4, or 5 on the rubric: 23% scored satisfactory (3), 32% good (4), and 45% excellent (5).
  • Of those completing their proposals, 50% were rated excellent (5), 33% good (4), and 17% satisfactory (3).
  • For the dissertation oral defense, 50% were rated good (4) and 50% excellent (5).
  • For the dissertation document, 63% scored good (4) and 37% excellent (5).

In terms of professional dispositions:

Year 1 ratings showed 73% on target (30% scoring 3, 14% scoring 4, and 29% scoring 5). 26% needed improvement (2), and 1 student was rated weak (1).

Year 2 ratings still showed 73% on target (3, 4, or 5) but with slight improvement: fewer students scored 3 and more scored 4 (29% scoring 5, 23% scoring 4, and 21% scoring 3). No one scored below 2, although 27% were at level 2 (needs improvement).

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used the results.

Assessment results informed the decision to require GRE scores from all PhD applicants, including students who completed our Masters program, because scores on assessments such as the literature review and the proposal indicated that students needed stronger writing skills to be successful.

Assessment results indicated that students needed more practice with data analysis and interpretation prior to reaching the dissertation stage, and courses (LTEC 665, 667, and 668) were revised accordingly.

Time from completion of coursework to proposal defense was examined. To move students more quickly to the dissertation phase, LTEC 760 was revised, and a decision was made to reduce section sizes to increase opportunities for hands-on support during proposal development.

Alumni survey results indicated that graduates would like more diverse electives to choose from and that more conference travel support is needed. Faculty developed 2 new courses that will serve as elective options and are actively seeking ways to increase the funding pool for student travel support.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Although student assessment is constant, faculty need to be reminded that program assessment needs to be constant as well. If data are not entered at the end of each semester, problems with missing data may occur: faculty may retire or resign, and the department does not have access to their assignment results.

While the entire faculty review data from surveys and discuss student dispositions, it may be good practice to involve more faculty in the review of data that come from courses. Currently, only faculty teaching doctoral courses are involved in reviewing class data.

Since the LTEC PhD program is new, this has been the first program assessment to include a review of three years of data. The results are very positive and have reinforced the great work achieved by our students and faculty.

17) If the program did not engage in assessment activities, please justify.

Program assessment has been conducted.