Program: Travel Industry Mgt (BS)
Date: Mon Dec 07, 2020 - 4:21:43 pm
1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)
1. Knowledge and Global Perspective: Identify and demonstrate skills relevant to the operational areas of hospitality and tourism management. For Tourism/Transportation Track: (Ta) Describe and assess the strengths and weaknesses of the major modes of transportation, (Tb) Explain transportation operations and management, as well as related global, environmental, technological, regulatory, and risk management issues faced by transportation professionals, (Tc) Describe the elements of the tourism system and explain their interrelationships, (Td) Identify and define sustainability issues for the tourism system and explain how they can be addressed. For Hospitality Track: (Ha) Analyze external and internal environmental factors that affect hospitality organizations, (Hb) Identify, explain, and apply the management concepts, principles, and processes in operational areas of hospitality organizations.
(1b. Specialized study in an academic field)
2. Effective Communication: (a) Demonstrate effective written communication skills, (b) Demonstrate effective oral communication skills.
(1a. General education, 2c. Communicate and report, 3d. Civic participation)
3. Critical Thinking: (a) Analyze situations and develop alternative options to resolve identified issues, (b) Synthesize appropriate information to develop reliable, valid, and logical arguments.
(1a. General education, 2a. Think critically and creatively, 2b. Conduct research, 3a. Continuous learning and personal growth)
4. Leadership and Teamwork: (a) Demonstrate effective leadership skills or traits of a leader, (b) Work productively, respectfully, and professionally as a team member.
(1a. General education, 3d. Civic participation)
5. Ethics and Stewardship: (a) Apply ethical behavior, (b) Evaluate the importance of host cultures to the global travel industry and utilize sustainable practices.
(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 3b. Respect for people and cultures, in particular Hawaiian culture, 3c. Stewardship of the natural environment)
2) Your program's SLOs are published as follows. Please update as needed.
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number: http://www.catalog.hawaii.edu/schoolscolleges/tim/TravelIndustryManagement.html
Course Syllabi. URL, if available online: Program Learning Objectives are reproduced in all TIM course syllabi.
3) Please review, add, replace, or delete the existing curriculum map.
- File (12/07/2020)
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.
5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):
Yes, on some (1-50%) of the program SLOs
Yes, on most (51-99%) of the program SLOs
Yes, on all (100%) of the program SLOs
6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?
No (skip to question 17)
7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)
8) Briefly explain the assessment activities that took place since November 2018.
The TIM School seeks to prepare students to succeed in managerial and executive positions in Hawaii's largest industry. We therefore consider students' performance as interns to be a valid and reliable indicator of our effectiveness in preparing students to succeed as professionals in the travel industry. Accordingly, since our last assessment report, we have begun an initiative to evaluate the performance of TIM School undergraduate students enrolled in our internship courses. Our initial focus has been the 58 students enrolled in Internship II (TIM 200) during Spring 2019 and Fall 2019. Student performance in this internship course was measured by asking each intern's employer to complete a standardized evaluation form designed and implemented by Deborah Fitzgerald, TIM School Director of Internship/Career Development. The form solicited employers' assessments of students' performance in terms of "relations with others," "attitude-application to work," "judgement," "dependability," "ability to learn," "quality of work," "initiative," "reaction to criticism," "ethical issue recognition," "evaluation of different ethical perspectives," and "overall performance," on 5-point scales with labels customized to each point of each dimension. For example, for the "ability to learn" dimension, the labels, from low to high, were "very slow to learn," "slow to learn," "satisfactory in learning," "learn readily," and "learns very quickly," whereas for the "attitude-application to work" dimension, the labels, from low to high, were "poor attitude," "fair attitude," "satisfactory attitude," "very good attitude," and "outstanding attitude." In addition, students' attendance was rated as either "regular" or "irregular," and their punctuality was rated as either "outstanding" or "irregular." Employers were also asked to comment on the "strengths or weaknesses" of students.
Finally, employers were asked to indicate whether they would "recommend this student for future employment in your own or another firm" and whether they felt students had "potential for promotion," and provide comments explaining their views.
We have also made some efforts to improve our undergraduate students' writing abilities. Since our 2018 assessment of student performance found that students enrolled in our undergraduate capstone course, TIM 431, had writing skills lower than expected, Assessment Coordinator Dan Spencer prepared and distributed to all TIM School instructional faculty a PowerPoint slide deck designed to improve students' technical writing abilities. This slide deck will be sent via email to Monica Stitt-Bergh immediately after the submission of the present report.
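For readers implementing a similar instrument, the structure of the evaluation form described above can be sketched as a simple data structure. This is an illustrative reconstruction, not the actual form: only the "ability to learn" and "attitude-application to work" label sets are quoted from the report, and the generic labels used for the other dimensions are placeholders.

```python
# Illustrative sketch of the employer evaluation form described in this report.
# Dimension names come from the report; label sets marked GENERIC_LABELS are
# placeholders, since the report quotes labels for only two dimensions.

GENERIC_LABELS = ["1", "2", "3", "4", "5"]  # placeholder anchors

EVALUATION_FORM = {
    "relations with others": GENERIC_LABELS,
    "attitude-application to work": [
        "poor attitude", "fair attitude", "satisfactory attitude",
        "very good attitude", "outstanding attitude",
    ],
    "judgement": GENERIC_LABELS,
    "dependability": GENERIC_LABELS,
    "ability to learn": [
        "very slow to learn", "slow to learn", "satisfactory in learning",
        "learn readily", "learns very quickly",
    ],
    "quality of work": GENERIC_LABELS,
    "initiative": GENERIC_LABELS,
    "reaction to criticism": GENERIC_LABELS,
    "ethical issue recognition": GENERIC_LABELS,
    "evaluation of different ethical perspectives": GENERIC_LABELS,
    "overall performance": GENERIC_LABELS,
}

# Binary items from the same form
BINARY_ITEMS = {
    "attendance": ("regular", "irregular"),
    "punctuality": ("outstanding", "irregular"),
}

def label_for(dimension: str, rating: int) -> str:
    """Return the scale label shown to employers for a 1-5 rating."""
    return EVALUATION_FORM[dimension][rating - 1]
```

Encoding the labels alongside the numeric points keeps the customized anchors attached to the ratings they describe when results are later tabulated.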
9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.
As explained above, evaluation forms that assessed the performance of 58 students enrolled in Internship II (TIM 200) during Spring 2019 and Fall 2019 were completed by the employers who supervised these students. No sampling was conducted; all 58 student enrollees were evaluated.
11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)
Ad hoc faculty group
Persons or organization outside the university
Advisors (in student support services)
Students (graduate or undergraduate)
Other: The Assessment Coordinator, Dan Spencer, completed the analysis today. He will share the results with the entire TIM School faculty, as well as the Shidler College of Business' Assurance of Learning Coordinator.
12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.
Most broadly, employers' evaluations of student interns' "overall performance" ranged from 3 ("acceptable") to 5 ("outstanding") and averaged 4.69. No students were rated as 1 ("fair") or 2 ("poor"). On the other 5-point scales, such as "Quality of Work" and "Initiative," responses ranged from 3 to 5, and averages ranged from 4.31 in the case of "Judgement" to 4.74 in the case of "Ability to Learn." Similarly, 98.3% of employers indicated that students' attendance was "Regular" as opposed to "Irregular," 100.0% indicated that students' punctuality was "Outstanding" as opposed to "Irregular," 100.0% indicated that they would "recommend this student for future employment in your own or another firm," and 100.0% affirmed that students had "potential for promotion."
While we find these results encouraging, they are only preliminary, and we feel it is necessary to gain a more comprehensive understanding of the data by also analyzing the many comments written by employers. We will also enter and analyze data from evaluations of Internship III (TIM 300) for Spring 2019 and Fall 2019, which used the same evaluation forms. The results of these analyses are forthcoming.
Finally, we would like to caution that the preliminary results reported above may, to some unknown extent, be positively inflated by the fact that evaluations, in most cases, culminated in discussions between employers and interns: 88.7% of employers stated that they had discussed their evaluations with interns, and nearly all interns signed the form. Since some people are disinclined to criticize others in face-to-face meetings, the social context of the evaluations may have produced more positive results than would otherwise have been the case. In addition, one needs to keep in mind that interns were evaluated in 2019, a year of very low unemployment, and employers may have felt inclined to evaluate interns positively to increase the probability of their applying for permanent employment with their firm in the future. Thus, it would be interesting to compare internship evaluations for 2019 with those for 2020, when unemployment rates have been historically high.
14) What best describes how the program used the results? (Check all that apply.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other: Since the results were produced only today, they have not yet been utilized. Again, the results will be widely shared.
15) Please briefly describe how the program used its findings/results.
As indicated above, application of findings is forthcoming.
16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.
As indicated above, our results were completed earlier today and at this point are only preliminary. Further analysis, especially of employers' written comments, may reveal additional insights.
17) If the program did not engage in assessment activities, please justify.