Unit: Medical Technology
Program: Medical Technology (BS)
Degree: Bachelor's
Date: Tue Sep 22, 2020 - 7:45:48 am

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Meet the MLS entry-level competency skills for each discipline as defined by the Curriculum subcommittees. Skills include pre-analytical (e.g., specimen procurement, preparation, equipment calibration), analytical (e.g., analysis, instrument operation, quality control), and post-analytical (e.g., reporting, follow up).

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report, 3d. Civic participation)

2. Demonstrate MLS entry-level knowledge of each discipline as specified by the Curriculum subcommittees. Knowledge includes principles of analysis, sources of error, correlations, interpretations, management of unexpected outcomes, and communication.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report, 3a. Continuous learning and personal growth, 3d. Civic participation)

3. Demonstrate MLS entry-level professionalism as specified by the Curriculum subcommittees. Professional traits include reliability, flexibility, integrity, ethics, initiative, and interpersonal relations.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report, 3a. Continuous learning and personal growth, 3b. Respect for people and cultures, in particular Hawaiian culture, 3d. Civic participation)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: medtech.jabsom.hawaii.edu
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online: medtech.jabsom.hawaii.edu
UHM Catalog. Page Number: https://manoa.hawaii.edu/catalog/schools-colleges/medicine/medt/
Course Syllabi. URL, if available online:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2020:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

No
Yes, on some (1-50%) of the program SLOs
Yes, on most (51-99%) of the program SLOs
Yes, on all (100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)
Other: Applied for accreditation renewal.

8) Briefly explain the assessment activities that took place since November 2018.

To prepare the Self-Study report for continued accreditation, the program assessed various parameters including curriculum, faculty qualifications and continuing education, student progress (admissions, graduation rate, certification exam pass rate), evaluation mechanisms (including affective evaluations), student evaluations, the graduate survey, the employer survey, and other measures.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Graduate survey (2018): 3 (75%)

Course evaluations: varies (not made public)

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.

Several meetings of faculty members were convened. A summary was reported to the Advisory Committee, which consists of KCC and UHM program officials, clinical liaisons, KCC and JABSOM administrators, and others. In general, program achievement was deemed satisfactory. In 2020, the program was granted continued accreditation for 10 years (the maximum) by the National Accrediting Agency for Clinical Laboratory Sciences.

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used its findings/results.

The following is excerpted from the 2019 NAACLS accreditation self-study report regarding program evaluation. Documents related to this section were submitted (or were already available to NAACLS via annual reports, or were presented to the site visitors):

  • ASCP national certification exam statistics
  • Admissions and Graduation statistics
  • Minutes of joint MLT/MLS Advisory Committee meeting
  • Minutes of regular faculty meetings
  • Entry-level competency statements: career entry-level objectives (used to construct checklists)
  • Other survey summaries

Additional documents submitted to NAACLS with this section include:

  • Course syllabi with SLOs
  • MEDT 591 "terminal" competency checklists for each lab discipline and for professional traits
  • A summary of recent graduate feedback
  • A summary of employer feedback
  • eCAFE and non-eCAFE student feedback for campus courses

 

 

Standard VIII.C. Curriculum Requirements – Evaluations

 

C.1

Program Evaluations. Several tools are used to assess program effectiveness. Data are also submitted in the annual NAACLS survey.

 

(a) ASCP-BOR MLS Exam Scores: The Program Director regularly accesses the ASCP BOR MLS exam scores of graduates who challenged the exam. Since MEDT 591 Clinical Training rotations are generally completed in November, exam scores are usually available near the end of the year. As indicated in Standard II.B., the overall percentage of students who pass the exam within one year is above the NAACLS-approved benchmark (AY2013-2017 first-time pass rate = 91%; overall pass rate 95%; 2016-17 pass rate = 100%). Summarized results with names redacted are reviewed annually by faculty members and Clinical Liaisons to determine areas that must be improved. Exam pass rates are also presented to the Joint MLT-MLS Advisory Committee.

 

(b) Admission and Graduation Statistics: The Admissions Committee and the Program Director compile these data periodically. Since Fall 2018, when admission of second-degree students began, enrollment has nearly doubled. As indicated in Standard II.B., the overall graduation rate of our students is above the NAACLS-approved benchmark at 84%. If “final half of program” is defined as the final two semesters, the rate is nearly 100%. We are closely monitoring the progress of second-degree students to ensure their successful completion of the program. The first cohort is expected to graduate in May 2019. A summary of the anonymous surveys is attached under Standard II.C.

 

(c) Graduate Feedback: Near the end of MEDT 591 Clinical Training, a feedback form is sent to students. Feedback is summarized by the Program Director and shared with faculty, Clinical Liaisons, and the Joint Advisory Committee. Status of employment as an MLS is one of the items included in the survey. According to the AY16-17 survey, 100% of the respondents reported employment. A sample is attached (Standard-VIII-GraduateFeedback.pdf). Beginning Fall 2018, an electronic format is used to solicit student feedback in order to improve the return rate and ensure anonymity.

 

(d) Employer (Affiliate) Feedback: Near the end of MEDT 591 Clinical Training, a feedback form is sent to Clinical Liaisons. Feedback is summarized by the Program Director and shared with faculty, Clinical Liaisons, and the Joint Advisory Committee. A sample is attached (Standard-II-EmployerSurvey.pdf).

 

(e) Joint MLT-MLS Advisory Committee: The Committee convenes annually and is asked to discuss program effectiveness and provide guidance on future directions. Sample minutes of a meeting are attached (Standard-VII-AdvisoryCmteMinutes.pdf).

 

(f) Course Evaluations: Near the end of each semester, students are asked to submit course and faculty evaluations using the eCAFE system. The electronic system is anonymous and provides feedback directly to the faculty. The system also provides UH campus and JABSOM-wide statistics for comparison. Faculty members are not required to submit a copy to the Department Chair, but are encouraged to include the data in their annual faculty evaluation submitted to the Dean. Examples of eCAFE reports are attached below. Starting Fall 2018, UHM switched to a similar but new system called the Course Evaluation System (CES).

 

(g) Surveys: As indicated above, we have used anonymous electronic surveys to solicit feedback from students (e.g., second-degree students), Affiliates, and graduates. This anonymous format is simpler to complete and thus should increase the response rate.

 

Other information:

 

Course Syllabus: At the on-campus level, the faculty member assigned to each MEDT course develops the course objectives (student learning outcomes). The course syllabus with course objectives is provided to students at the beginning of the semester. One way to ensure that objectives are linked to assessment tools is to use an exam grid. An example from MEDT 451 that contains the schedule (and frequency of exams), policies, objectives (including the affective domain), and an exam grid is attached (Standard-VIII-451Syllabus.pdf).

 

Exams: Mid-semester assessments and submitted student work are used for formative assessment of student progress. Usually one or more mid-semester exams are administered, along with a number of other assignments (e.g., homework, reports, lab write-ups, online quizzes on Laulima). Detailed feedback is given to students as necessary. Summative assessment is based on the final exam score and other data, so that a grade can be assigned. The grading policy is included in the course syllabus. For laboratory courses, practical exams are also administered.

 

MEDT courses graded on a credit/no credit (Cr/NC) basis use student activity records to determine the final grade. For example, in MEDT 481 (Professional Issues), at least three oral presentations are required. Presentations are assessed on criteria such as effectiveness of verbal/non-verbal communication, precision and accuracy of the presentation, relevance, ability to answer audience questions, and giving and receiving feedback.

 

Advising: Faculty members are available for course-specific consultation during the semester. Students are also urged to seek academic advising at least once during the semester. Students in their final semester at UHM must submit graduation applications signed by an academic advisor.

 

Program Level Assessment: The general Student Learning Outcomes (SLOs), along with program assessment data, are updated bi-annually. Data are accessible from the UHM Assessment Office: https://manoa.hawaii.edu/assessment/update2/view.php (click on Medical Technology). The SLOs are also published in the general Program Brochure.

 

Entry Level Competency: Entry-level competency has been defined by the Department’s Curriculum Committee and its subcommittees, and it is used to formulate the MEDT 591 objectives. The objectives address cognitive, psychomotor, and affective domains. They are periodically reviewed and updated by ad hoc committees, and changes are incorporated into the check sheets. During MEDT 591 Clinical Training, supervising faculty assigned by the Clinical Liaison assess student performance and complete the evaluation check sheet. Completed check sheets are reviewed and signed by the student and the Clinical Liaison prior to submission to the Program Director, who is responsible for assigning the Cr/NC grade. During clinical training, students have access to the online course management system (Laulima), where a final comprehensive exam is administered. A sample of MEDT 591 objectives and check sheets (CHUB) is attached (Standard-VIII-591CheckListCHUB.pdf).

 

 

C.2

MEDT course examinations used to assess student performance are related to the stated course objectives. MEDT course syllabi with calendars, grading policies, etc. are attached below. The MEDT 451 syllabus has an exam grid that shows the relationship between the course objectives and the relative number of questions per question level asked on the final exam (Standard-VIII-451Syllabus.pdf). Full course files for MEDT courses are available on site.

 

All Clinical Affiliates use the common entry-level objectives and checklist to assess and document student performance. Areas of assessment are (a) Professional Traits, (b) Immunology, (c) Clinical Chemistry, (d) Coagulation, Hematology, Urinalysis, Body Fluid Analysis (CHUB), (e) Blood Bank, and (f) Clinical Microbiology. Checklists are attached below.

 

The UHM Medical Technology program maintains high performance on important outcomes: national certification exam pass rate, graduation rate, and graduate placement rate. Opening the second-degree admission route has significantly increased enrollment. The first cohort of students on this route is expected to graduate in May 2019. The progress of second-degree students will continue to be monitored.

 

MEDT Courses: Each faculty member responsible for MEDT courses reviews the results of assessments in his/her own courses (exams, homework, etc.) and makes appropriate adjustments to the courses. Summary data from ASCP certification exams are also used to identify subject areas that require attention, and modifications are made in specific courses.

 

Course and Instructor Evaluations: Near the end of each semester, students are asked to complete an online course and instructor evaluation called eCAFE. The system tallies the responses anonymously and returns the data to faculty after course grades have been submitted. On eCAFE, there are a few system-generated questions, to which the faculty member adds other questions. Faculty members use student feedback to make adjustments to their respective courses as appropriate. Attached below (Standard-VIII-ClassEval151.pdf and others) are examples of eCAFE reports for MEDT classes. Data under “University of Hawaii at Manoa” are UHM campus-wide statistics; “School of Medicine” data are summary statistics of all courses offered through JABSOM; “Allied Medical Sciences” data are composite statistics of all MEDT courses; “CRN” data are course-specific statistics. As stated above under C.1., UHM switched to the new CES system of course/instructor evaluation in Fall 2018. New data will be available on site.

 

Graduate Feedback: At the end of MEDT 591 Clinical Training, students are asked to complete a feedback form. The form covers the MEDT 591 experience as well as overall program assessment. Program-level SLOs are reproduced on the form as a reminder. A summary of student feedback from AY2016-2017 is attached (Standard-VIII-GraduateFeedback.pdf).

 

 

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

None.

17) If the program did not engage in assessment activities, please justify.

N/A