Program: Undergraduate Office of Student Academic Services
Date: Mon Oct 07, 2013 - 10:15:04 am
1) Below are your program's student outcomes (SOs). Please add or update as needed.
1. Pre-business students will be able to identify admission requirements.
2. Business students will develop an individual academic plan specific to their needs.
2) Your program's SOs are published as follows. Please update as needed.
Student Handbook. URL, if available online: NA
Information Sheet, Flyer, or Brochure. URL, if available online: NA
UHM Catalog. Page Number: 181
Other: New Student Orientation (NSO)
Other: UHCC transfer workshops, College & Career Fair
3) Provide the program's activity map or other graphic that illustrates how program activities/services align with program student outcomes. Please upload it as a PDF.
- File (10/07/2013)
4) Did your program engage in any program assessment activities between June 1, 2012 and September 30, 2013? (e.g., establishing/revising outcomes, aligning activities to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys, etc.)
Yes
5) For the period June 1, 2012 to September 30, 2013: State the assessment question(s) and/or assessment goals. Include the student outcomes that were targeted, if applicable.
1) Pre-business students will be able to identify admission requirements.
2) Business students will develop an individual academic plan specific to their needs.
6) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #5.
During the second half of the reporting period, our unit moved from paper surveys to electronic surveys (surveyshare.com). Evidence gathered included student satisfaction surveys completed after individual academic planning appointments, individual admissions advising appointments, mandatory group advising sessions, and UHCC transfer workshops.
7) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.
Approximately 800 student responses were received.
8) Who interpreted or analyzed the evidence that was collected? Check all that apply.
Faculty/staff committee
Ad hoc faculty/staff group
Director or department chairperson
Persons or organization outside the university
Students (graduate or undergraduate)
Dean or Associate Dean
Advisory Board
Other:
9) How did he/she/they evaluate, analyze, or interpret the evidence? Check all that apply.
Used quantitative methods on student data (e.g., grades, participation rates) or other numeric data
Used qualitative methods on interview, focus group, or other open-ended response data
Scored exams/tests/quizzes
Used a rubric or scoring guide
Used professional judgment (no rubric or scoring guide used)
External organization/person analyzed data (e.g., Social Science Research Institute)
Other:
10) For the assessment questions/goals stated in Question #5, summarize the actual results.
Approximately 90% of pre-business respondents indicated they could identify the admission requirements for our college, while approximately 80% of business respondents developed and submitted an individual academic plan specific to their needs.
11) What was learned from the results?
The change from paper to electronic surveys produced a higher response rate and validated our efforts to teach admission requirements and academic planning in multiple modalities (e.g., 1:1 advising, group sessions, transfer workshops, recruitment efforts, and new student orientation sessions). In addition, all paper documents (e.g., major declarations, graduation petitions, course petitions) were converted to web-based, form-fillable versions to improve student access.
12) State how the program used the results or plans to use the results. Please be specific.
To be more user-friendly, our unit moved from paper-based assessment activities and forms to online, form-fillable versions. This shift reduced paper volume and duplication and improved overall student use. Our goal is to continue using web-based instruments and documents.
13) Reflect on the assessment process. Is there anything related to assessment procedures your program would do differently next time? What went well?
We modeled our assessment activities on other customer-service follow-ups (e.g., mail-based surveys after a purchase or medical appointment, online shopping experiences). After a student receives a service (e.g., an individual advising appointment or a group session), the student receives an electronic survey through their university email account within 24 hours. We found this approach very effective and plan to continue its use.