Program: Molecular Biosciences & Bioengineering (PhD)
Date: Tue Oct 07, 2014 - 8:00:10 pm
1) Below are your program's student learning outcomes (SLOs). Please update as needed.
The MBBE MS/PhD graduate program aims to cultivate creativity and provide an environment that promotes rigorous, cutting-edge scientific inquiry, new discoveries, and enthusiastic learning in an interdisciplinary atmosphere. MBBE student researchers aim to create new knowledge and technologies within the context of a broad-based education that engages and motivates them to be lifelong learners and contributors to society. MBBE graduates will carry a passion for learning and be active caretakers of the planet. The extent to which these goals are reached is assessed with the following Student Learning Outcomes (SLOs), in which students:
1. Are able to understand, describe, and explain fundamental core STEM science concepts, and have demonstrated the ability to translate these concepts into experimental approaches and hypothesis-driven research on biological systems.
2. Write, contribute results to, and publish articles, as primary author(s) or co-author(s), in peer-reviewed scientific journals of basic and applied molecular biosciences and bioengineering.
3. Present research at national and international conferences as evidenced by published abstracts and poster and/or oral presentations.
4. Can communicate orally and in writing in a clear, well-organized manner that effectively informs and clarifies scientific principles and laboratory techniques for others, as evidenced by seminars, technical reports, and dissertations or theses that detail their scientific and scholarly activities.
5. Are well prepared for employment in the critically important and dynamic biotechnology, chemical and biosciences fields (government, academia, industry).
2) Your program's SLOs are published as follows. Please update as needed.
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online: Flyer
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
3) Select one option:
- File (03/16/2020)
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.
5) Did your program engage in any program assessment activities between June 1, 2013 and September 30, 2014? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)
No (skip to question 14)
6) For the period between June 1, 2013 and September 30, 2014: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.
We did more than this: we changed the program's name and streamlined the confusing curricular options, both long-standing student requests. The program had an unattractive name, and we judged that a more modern name would serve it better.
Graduate Student Survey 2014 - not yet analyzed
7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.
Comprehensive student survey and comprehensive student exit survey. The changes reflect what students wanted.
Various new assessment and progress forms.
8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.
As memory serves, all students were surveyed in the regular survey; we had about 15 majors at any given time. In addition, each student is given an exit interview by the advisor.
Graduate Student Survey: 52 individuals replied to a 10-question survey of basic multiple-choice items and short comments.
9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)
Ad hoc faculty group
Persons or organization outside the university
Advisors (in student support services)
Students (graduate or undergraduate)
Other: Graduate Chair
10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other: Examining the best way to use this data
11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.
Students did not generally volunteer that they disliked the name of the program, though a few would use a nickname. When asked, however, they almost unanimously preferred the new name, Molecular Biosciences and Bioengineering. All along, students and their advisors had been flexible about the electives within the specialties; that flexibility was ensured by abolishing the specialties.
Graduate Student Survey: analysis has not yet been undertaken.
12) State how the program used the results or plans to use the results. Please be specific.
Changing the program's name was a major undertaking; it was accepted only after a third meeting of the Council of Deans. We hope that enrollment will rise with the new name.
Identify specific issues regarding course requests, the need for advising, and access to information about requirements.
13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.
The main thing we always discover is that the best-liked aspect of the program is spending a year in a mentor's laboratory doing a project.
Need for better access to advising; requests for additional graduate professional development courses.
14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.
The program relies, as its main assessment tool, on peer reviews by a board of faculty members who act anonymously and have no conflict of interest. As non-specialists, they have a mindset closer to the public's. Secondary tools are acceptances into graduate schools, especially those that offer assistantships, and into highly selective professional schools.