Unit: Psychology
Program: Psychology (BS)
Degree: Bachelor's
Date: Wed Nov 18, 2020 - 6:05:27 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Psychological knowledge: Synthesis and Application: Students will be able to describe key concepts, principles, and overarching themes in psychology; develop a working knowledge of psychology's content domains (e.g., cognition and learning, developmental, biological, and sociocultural); and describe applications of psychology.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research)

2. Scientific inquiry and critical thinking: Students will be able to use scientific reasoning to interpret psychological phenomena; demonstrate psychology information literacy; engage in innovative and integrative thinking and problem solving; interpret, design, and conduct basic psychological research; and incorporate sociocultural factors in scientific inquiry.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research)

3. Ethical and social responsibility in a diverse world: Students will be able to apply ethical standards to evaluate psychological science and practice; build and enhance interpersonal relationships; and adopt values that build community at local, national, and global levels.

(2b. Conduct research, 3a. Continuous learning and personal growth, 3b. Respect for people and cultures, in particular Hawaiian culture, 3c. Stewardship of the natural environment)

4. Communication: Students will be able to demonstrate effective writing for different purposes; exhibit effective presentation skills for different purposes; and interact effectively with others.

(1a. General education, 1b. Specialized study in an academic field, 2c. Communicate and report)

5. Professional development: Students will be able to apply psychological content and skills to career goals; exhibit self-efficacy and self-regulation; refine project-management skills; enhance teamwork capacity; and develop meaningful professional direction for life after graduation.

(1b. Specialized study in an academic field, 2c. Communicate and report, 3d. Civic participation)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: https://psychology.manoa.hawaii.edu/student-learning-outcomes/
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other: SLOs are on all course syllabi and available to students in each class, or upon request.

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2020:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.")(check one):

No
Yes, on some(1-50%) of the program SLOs
Yes, on most(51-99%) of the program SLOs
Yes, on all(100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issue related to student learning achievement for the program (explain in question 8)
Other:

8) Briefly explain the assessment activities that took place since November 2018.

The Department of Psychology's assessment efforts have included the following items:
 
Entrance and Exit Surveys: We have continued our assessment of the Psychology Department’s BA and BS programs by evaluating entrance and exit surveys of Psychology majors, first when they declare psychology as their major and then again when they file for graduation. The survey was previously revised in 2014 and again in 2016. In our most recent assessment (2018) we noted that the (needed) changes in the survey created a challenge as students who graduated and exited the program were given a different exit survey from the original entrance survey that they would have completed 2-3 years previously. We are now at a point where this issue has been minimized. An additional challenge that was identified in our previous assessment was our ability to distinguish between BA and BS students in the survey. Our revisions to the survey now enable us to separate BA from BS students in the current assessment report (note, student status can and does fluctuate over time). It should be noted that the validity of survey data is questionable in general, and therefore the Department has continued efforts to assess discipline-based student writing in capstone courses.
 
Capstone Papers: Over the present assessment period we collected writing samples from students enrolled in upper-division writing-intensive courses (i.e., PSY 4X9). These writing samples were assessed by the Undergraduate Studies Committee (composed of five faculty members), which evaluated 25 writing samples in total from a variety of classes. This was our second time conducting this type of assessment, and the committee felt that, while it was helpful and informative to see a range of samples across the discipline, the rubric needs refinement. The Undergraduate Studies Committee plans to refine and improve the rubric for the next assessment period. Although this will make it harder to compare performance across assessments, an improved assessment tool is preferable as we move forward.
 
Discussion of Psychology Minor Assessment: In the Fall of 2019 the Psychology Department began offering a Psychology minor. Unfortunately, it is not possible to identify minor students until after they have completed the minor, so an entrance/exit survey will not work to assess these students. The capstone paper assessment is also not a viable option, as these students are not required to complete a capstone course to earn the minor. The Undergraduate Studies Committee is discussing how best to assess the progress of students in our minor.
 
Online Learning and Psychology Advising Office Assessment: The swift transition to online learning during the coronavirus pandemic raised questions about whether the quality of both the Psychology program and the advising offered by the Psychology Advising Office could be maintained. In the summer of 2020 we surveyed our students to gauge their views on online courses and advising.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1: Student survey of perceptions of both online learning and advising
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Entrance and exit surveys:
 
The entrance survey is taken immediately after a student declares psychology as their major. It is typically administered at the location where the student declares the major, most often the Psychology Advising Office and occasionally ACCESS, and is currently administered online due to the coronavirus pandemic. The exit survey is administered to all graduating Psychology majors. The survey questions map directly onto the Psychology Department's SLOs, and therefore assess the extent to which Psychology undergraduate students perceive that their SLO achievement was enhanced by completing the major. Since our most recent assessment (2018), a total of 96 Psychology students completed the entrance survey (11 BS students and 85 BA students) and 54 completed the exit survey (10 BS students and 44 BA students).
 
Capstone Papers:
 
Writing samples from BS and BA students in different capstone courses (PSY 4X9) were collected and assessed by the Undergraduate Studies Committee. Sampling was random, and faculty members did not distinguish between BS and BA students in their classes. Writing samples were anonymized and randomly distributed to the five faculty members on the Undergraduate Studies Committee. A total of 25 writing samples were collected across different semesters and analyzed (N = 25 students).
 
Online learning and advising survey:
 
Due to the swift change to online learning in the Spring of 2020, the department recognized the importance of surveying our undergraduate students about online course delivery and virtual advising. To this end, a survey was administered in the Summer of 2020. A total of 72 students responded.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.

Please find below descriptions and summary tables for different programmatic assessment activities that we have conducted:
 
Entrance and Exit Surveys
 
Mean scores are presented below; possible scores range from 1 (Strongly agree) to 5 (Strongly disagree) across SLO items, so lower scores indicate stronger agreement. The average scores suggest that students are meeting or exceeding expectations in all categories assessed. It is important to note that while separate assessment reports are required for the Psychology BS and BA programs, both are reported below because both groups are Psychology majors and our courses are not separated by program status (i.e., all courses count equally toward the BS and BA programs).

UHM Undergraduate Psychology Entrance Survey. All respondents used a scale (1-5) ranging from Strongly agree to Strongly disagree.
 

1. I currently feel knowledgeable about psychological concepts, theoretical perspectives, research findings, or historical trends.

Program   N    M (SD)
BA        85   2.15 (0.92)
BS        11   2.27 (1.19)

2. I have a background in basic research methods, including research design, data analysis, and interpretation, and able to use critical and creative thinking in solving problems.

Program   N    M (SD)
BA        85   2.07 (1.03)
BS        11   2.00 (1.26)

3. I understand how psychological concepts can be used in everyday life and organization as well as the ethical complexities involved in applying psychology to social situations.

Program   N    M (SD)
BA        85   1.77 (1.18)
BS        11   2.18 (1.25)

4. I currently feel comfortable with my communication skills including writing, interpersonal and oral skills, showing quantitative literacy and collaborating with others.

Program   N    M (SD)
BA        85   1.99 (1.18)
BS        11   2.09 (1.30)

5. I feel capable of understanding mental processes, applying effective strategies for self-management, including self-regulation, and integrity. I also feel capable of using psychological skills, values and information for my future career.

Program   N    M (SD)
BA        85   1.96 (1.23)
BS        11   1.90 (1.28)

 

UHM Undergraduate Psychology Exit Survey (upon graduation) 

 

1. As a result of majoring in Psychology, I feel more knowledgeable in psychological concepts, theoretical perspectives, research findings, or historical trends.

Program   N    M (SD)
BA        44   1.84 (0.71)
BS        10   1.80 (0.79)

2. I have a better understanding of basic research methods, including research design, data analysis, and interpretation and able to use critical and creative thinking in solving problems.

Program   N    M (SD)
BA        44   1.84 (0.92)
BS        10   1.50 (0.71)

3. I now understand how psychological concepts can be used in everyday life and organization as well as the ethical complexities involved in applying psychology to social situations.

Program   N    M (SD)
BA        44   1.63 (0.92)
BS        10   1.60 (0.52)

4. I developed effective communication skills including writing, interpersonal and oral skills, showing quantitative literacy and collaborating with others.

Program   N    M (SD)
BA        44   1.82 (0.84)
BS        10   1.70 (0.67)

5. I benefited from studying psychology with regards to developing an understanding of mental processes, applying effective strategies for self-management, including self-regulation, and demonstrations of integrity. I also feel capable of applying my skills, values and information I received as a Psychology major to my future career.

Program   N    M (SD)
BA        44   1.71 (0.81)
BS        10   1.60 (0.70)

 

Summary Table of t-test Analyses:

 

BS Entrance vs. Exit Data

SLO Item   Entrance M (SD)   Exit M (SD)    Significance
SLO 1      2.27 (1.19)       1.80 (0.79)    p = .147
SLO 2      2.00 (1.26)       1.50 (0.71)    p = .137
SLO 3      2.18 (1.25)       1.60 (0.52)    p = .09
SLO 4      2.09 (1.30)       1.70 (0.67)    p = .197
SLO 5      1.90 (1.29)       1.60 (0.70)    p = .264

When looking at responses for BS students (see the BA assessment report for further details on the BA program), students overall reported "Strongly agree" or "Agree" for all SLOs. Importantly, there was a numerical trend toward greater agreement on all SLOs at the exit point of the degree, although the differences were not statistically significant. One possible reason for the already-high agreement on the entrance survey is that Psychology majors must first successfully complete three core courses with a GPA of 2.5 or better before being admitted to the program. It is possible that exposure to PSY 100 (Introductory Psychology), PSY 212 (Research Methods), and PSY 225 (Statistics) results in higher scores to begin with, making it difficult to detect statistically significant change in this survey.
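
As an illustrative aside (not part of the Department's formal analysis), comparisons like those in the table above can be approximated directly from the reported summary statistics. The short Python sketch below assumes independent-samples Welch t-tests with two-tailed p-values; the report does not state exactly which test was used, so the computed values may differ somewhat from those reported.

# Hedged sketch: approximate the BS entrance-vs-exit comparisons from the
# summary statistics reported above (n = 11 entrance, n = 10 exit).
# Assumes independent-samples Welch t-tests; the exact test used by the
# committee is not specified, so results may not match the reported p-values.
from scipy.stats import ttest_ind_from_stats

# (label, entrance mean, entrance SD, exit mean, exit SD)
slos = [
    ("SLO 1", 2.27, 1.19, 1.80, 0.79),
    ("SLO 2", 2.00, 1.26, 1.50, 0.71),
    ("SLO 3", 2.18, 1.25, 1.60, 0.52),
    ("SLO 4", 2.09, 1.30, 1.70, 0.67),
    ("SLO 5", 1.90, 1.29, 1.60, 0.70),
]

for label, m_in, sd_in, m_out, sd_out in slos:
    t, p = ttest_ind_from_stats(m_in, sd_in, 11, m_out, sd_out, 10,
                                equal_var=False)  # Welch's t-test
    print(f"{label}: t = {t:.2f}, two-tailed p = {p:.3f}")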

Capstone Papers
 
The grading rubric for evaluating student writing performance was approved by the Undergraduate Studies Committee and the full faculty in 2014 and implemented in the 2015-16 academic year. The Undergraduate Studies Committee plans to revise the rubric before completing our next assessment. Specifically, a stronger separation between 'Sources' and 'Using evidence to support the author's perspective' is needed, along with other refinements.
 
Mean scores are presented below, with possible scores ranging from 1 (Below expectations) to 3 (Exceeds expectations).
 

 

Criterion                                             2018-2020 (n = 25)   Previous assessment (n = 76)   Significance
Context of and purpose for writing                    2.63                 2.39                           p = 0.039*
Using evidence to support the author's perspective    2.44                 2.21                           p = 0.075
Genre and disciplinary conventions                    2.45                 2.10                           p = 0.006*
Sources                                               2.48                 2.25                           p = 0.082
Control of syntax and mechanics                       2.49                 2.12                           p = 0.006*

 
The average scores indicate that students are meeting or exceeding expectations in all categories assessed for writing performance. Compared with the scores observed in our previous assessment, numerical improvement was observed across all criteria, with significant improvement (p < 0.05) for 'Context of and purpose for writing,' 'Genre and disciplinary conventions,' and 'Control of syntax and mechanics.' It is interesting to note that in our previous assessment students received lower scores on genre and disciplinary conventions and on control of syntax and mechanics. While it is unclear whether the improvement in these scores reflects faculty efforts to provide additional instruction in these areas, random sampling, or a relaxation in the application of the rubric, it is nonetheless a welcome improvement.
 
The following table shows the percentage of students meeting expectations or higher (a score of 2 or above).
 

 

Criterion                                             2020 assessment (n = 25)   Previous assessment (2015-2018)
Context of and purpose for writing                    97%                        89%
Using evidence to support the author's perspective    90%                        88%
Genre and disciplinary conventions                    93%                        66%
Sources                                               90%                        80%
Control of syntax and mechanics                       93%                        80%

 

Online learning and Psychology Advising Office survey:
 
In the summer of 2020 we surveyed our undergraduate students (n=72) to gauge their perceptions of distance learning and virtual advising. Pertinent results are described below:
 
Question: How would you rate the transition of your in-person courses to an online platform?
 
Overall, 61% of students found the transition to online classes either not challenging or only a little challenging, whereas only 17% of respondents found the transition very challenging or extremely challenging. The remaining students found the transition moderately challenging.
 
Question: How has the move to an online platform impacted your ability to understand the material presented in the courses you are enrolled in?
 
Overall, 51.6% of students responded neutrally to this question, while 7.8% reported somewhat greater comprehension; 41% of students reported somewhat less comprehension or less comprehension.
 
Question: How has the move to an online platform impacted your level of engagement with the courses you are enrolled in?
 
Overall, only 9.4% of students reported being somewhat more engaged, while 23.4% were neutral on this question; 39.1% reported being somewhat less engaged and 28.1% reported being less engaged. The main reasons given for this lack of engagement were anxiety, internet problems, and limited access to instructors. It should be noted, however, that many positive aspects of online learning were also mentioned.
 
Question: How important is it that your Psychology undergraduate advisors have a background in psychology?
 
87.7% of the sample said that having advisors with a background in psychology is either moderately important (24.5%), very important (36.7%), or extremely important (26.5%). 10.2% said that this was only a little important, and only one respondent said that it was not at all important. This affirms the department's decision to keep advising for our majors housed solely within the Department of Psychology.
 
Question: If both in-person and online advising were offered, which would you prefer and why?
 
Only 6.4% of the sample preferred online advising, with the majority preferring either both formats (55.3%) or in-person advising (38.3%).
 
Question: How would you rate the transition from in-person to online advising for Psychology?
 
Overall, 64.6% of the sample responded neutrally to this question, with an additional 18.8% and 10.4% finding advising somewhat helpful or very helpful, respectively. This was encouraging for the advising office, as only 6.2% of the sample felt that virtual advising services were unhelpful, suggesting that our efforts to continue providing high-quality advising in a virtual format were successful.
 

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used its findings/results.

Entrance and Exit Surveys: The main purpose of the entrance and exit surveys is to provide insight into the strengths and weaknesses of our psychology courses and into whether students are improving in our targeted areas (as related to the SLOs) over the course of the degree. The department has previously used this feedback to create lab sections for our research methodology course (PSY 212) and for some sections of PSY 100. These have been particularly useful for student learning, and as a department we strive to offer lab sections for every section of PSY 100 and PSY 212, although budget constraints make this difficult. This is further exacerbated by the impending budget restrictions due to the coronavirus and by the transition of Outreach courses to the day session. While the effects of the budget reduction are not yet known, the latter change significantly reduces the department's budget and its ability to fund future lecturers.
 
Interestingly, statistically significant differences were not found when comparing entrance and exit surveys for the Psychology BS. This was also the case in our previous assessment report, and we are unsure of the underlying reason. It should be noted that the sample contains only 21 students (11 entrance and 10 exit), so it is very likely that we simply do not have enough statistical power to detect differences, despite the strong numerical trends suggesting improvement across all SLOs. It is also possible that scores on the entrance survey were already high because these students had taken at least PSY 100, 212, and 225 (and possibly additional psychology and science courses for the BS) before taking the entrance survey. Another possibility is that the survey instrument is not sufficiently sensitive to capture differences, or that the subjective nature of gauging one's own knowledge introduces inaccuracies. Regardless, the Undergraduate Studies Committee plans to discuss this survey further and consider potential changes. For example, it would be ideal to administer the survey to pre-psychology majors (i.e., before they declare the major). The conversion rate from pre-psychology to psychology would result in many lost surveys, but this is an avenue worth exploring.
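
As a rough illustration of the power concern (again, not part of the Department's formal analysis), the sketch below assumes an independent-samples t-test, a two-sided alpha of .05, and a hypothetical medium standardized effect size of about 0.55, which is broadly consistent with the observed mean differences; the effect size and test choice are assumptions for illustration only.

# Hedged illustration of statistical power for the BS entrance/exit comparison.
# Assumes an independent-samples t-test, alpha = .05 (two-sided), and a
# hypothetical medium effect size (d ~ 0.55); illustrative only.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Approximate power with the current group sizes (n1 = 11 entrance, n2 = 10 exit)
power = analysis.power(effect_size=0.55, nobs1=11, ratio=10 / 11, alpha=0.05)
print(f"Approximate power with n = 11 vs 10: {power:.2f}")  # well below the conventional 0.80

# Approximate sample size per group needed to reach 80% power for the same effect size
n_needed = analysis.solve_power(effect_size=0.55, power=0.80, ratio=1.0, alpha=0.05)
print(f"Approximate n per group for 80% power: {n_needed:.0f}")  # on the order of 50 per group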
 
Capstone Papers: While the entrance/exit survey represents an ongoing assessment effort over the past decade, the assessment of the capstone papers is relatively new (this is the second iteration of this assessment strategy). The present findings indicate that students are meeting or exceeding expectations in all categories assessed for writing performance. Our previous assessment of the capstone papers (2018) suggested that instructors could place additional effort in ensuring that students acquire the necessary knowledge of genre and disciplinary conventions and of the correct control of syntax and mechanics; the latter was previously highlighted by the committee as an area of particular need for some students. The present results suggest that these efforts have paid off, as scores in these areas have significantly increased. However, the increase could also reflect sampling error, and we will continue to monitor this and adjust accordingly. Moreover, it is impossible to distinguish between BS and BA students for these papers (individual instructors do not distinguish between the two in the classroom), so it is unclear whether one group is overrepresented in the results.
 
Online learning and Psychology Advising Office survey: This was an important first step in what the Department of Psychology sees as a critical area for future assessment of the program. It is clear that more courses will be offered in an online format, with this transition accelerated by the coronavirus. Our survey results suggest that while some students excel in and prefer online courses, others struggle. We will look at ways to formalize this assessment and to compare results between courses that are offered in both in-person and online formats. Additionally, we hope to compare performance in the degree between students who have taken PSY 100 and 212 with and without a lab section (many transfer students have taken these courses at the community colleges without a lab section). Survey questions 1, 2, 3, and 5 would be especially useful in this assessment. Future online/in-person assessments are particularly important moving forward, especially for the PSY 4X9 seminar courses. These courses are usually offered by core faculty and carry an approved GenEd Focus designation (e.g., Writing, Ethics, Oral). This was originally done to ensure that Psychology majors have an opportunity to interact with and learn from faculty members. We hope to continue this tradition while also assessing the delivery of these courses in online and in-person formats.

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

Our assessment efforts have resulted in several important discoveries and achievements. First, the creation of this report has highlighted the need to expand our assessment efforts to account for distance learning. The global pandemic will likely result in more classes being offered in online or hybrid formats, so it will be important to assess these delivery formats and whether they have any influence on SLO achievement. Second, using the rubric a second time in our capstone assessment led the Undergraduate Studies Committee to realize that the rubric could be refined. Third, and related to the second point, the very act of assessing our program has led to several interesting and important discussions among faculty at both the Undergraduate Studies Committee and departmental levels.

17) If the program did not engage in assessment activities, please justify.