Unit: Economics
Program: Economics (BA)
Degree: Bachelor's
Date: Fri Oct 12, 2018 - 10:55:24 am

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Economic literacy: Be able to clearly explain core economic terms, concepts and theories.

(1a. General education, 1b. Specialized study in an academic field, 2b. Conduct research, 2c. Communicate and report)

2. Critical thinking: Demonstrate the ability to apply economic reasoning to contemporary social issues and policy problems.

(1a. General education, 1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report, 3a. Continuous learning and personal growth, 3b. Respect for people and cultures, in particular Hawaiian culture, 3c. Stewardship of the natural environment, 3d. Civic participation)

3. Quantitative reasoning: Apply appropriate quantitative and statistical techniques to economic analysis. Conduct economic analysis using equations and graphs.

(1a. General education, 2a. Think critically and creatively, 2b. Conduct research)

4. Reporting: Develop expertise needed to effectively communicate results of economic research and analysis to colleagues and decision-makers through written reports and oral presentations.

(1a. General education, 2b. Conduct research, 2c. Communicate and report, 3d. Civic participation)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL:
Student Handbook. URL, if available online: http://www.economics.hawaii.edu/undergrad/UG_guide_2016.pdf
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: https://drive.google.com/drive/folders/0B28X6RiBdo5kSHoyOHplYVlBdGM
Other: On the department's intranet site, accessible to our faculty only
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2018:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.") (Check one):

No
Yes, on some (1-50%) of the program SLOs
Yes, on most (51-99%) of the program SLOs
Yes, on all (100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between June 1, 2015 and October 31, 2018?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period June 1, 2015 to October 31, 2018? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
No (skip to question 17)
Investigate other pressing issues related to student learning achievement for the program (explain in question 8)
Other:

8) Briefly explain the assessment activities that took place.

We implemented a course-embedded assessment program in our core department courses. In December 2016, we assessed student performance in one section of Econ 300 (Intermediate Macroeconomics), two sections of Econ 301 (Intermediate Microeconomics), and two sections of Econ 321 (Introduction to Statistics). In December 2017, we assessed student performance in two sections of Econ 300, three sections of Econ 301, and two sections of Econ 321. While grading the final exams for their section, instructors scored a random sample of at least 50 percent of students using the department's scoring rubric, which was developed in 2005-2006.

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement
Student surveys that contain self-reports of SLO achievement
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

While grading the final exams for their section, instructors scored a random sample of at least 50 percent of students using the department's scoring rubric. In Fall 2016, 84 students were assessed: 15 from Econ 300, 24 from Econ 301, and 45 from Econ 321. In Fall 2017, 130 students were assessed: 37 from Econ 300, 26 from Econ 301, and 67 from Econ 321.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results of the assessment activities checked in question 7. For example, report the percentage of students who achieved each SLO.

The results of the December 2016 assessment showed a few differences from earlier reports. Critical Thinking dropped slightly, from 80% of students receiving a 2 or 3 in the 2014 report to 77%. Reporting rose slightly, from 80% in 2014 to 82%. Although Quantitative Reasoning showed a marked decrease, from 91% in 2014 to 85%, the current figure is still higher than those of 2013 (78%) and 2012 (75%). There was no obvious change in Economic Literacy.

The results of the December 2017 assessment, however, showed an overall decrease from 2016 in all four categories: Economic Literacy dropped slightly from 79% to 76%; Critical Thinking dropped slightly from 77% to 73%; Quantitative Reasoning dropped from 85% to 73%; and Reporting dropped from 82% to 73%.


14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used the results.

The department has strengthened mathematical training by offering Econ 256 (Data Analysis and Visualization), starting spring 2019. We have also launched a new Quantitative Economics Concentration (started in fall 2017). Both should better prepare our students in Quantitative Reasoning, and they may also have positive spillover effects on Critical Thinking and Reporting.

We will review past assessment reports to determine whether the decreases observed between 2016 and 2017 fall within the normal year-to-year variation. We did note a rather large increase in sample size from 2016 to 2017, thanks to better data collection efforts. One possibility is that in years when the number of observations is small (few classes assessed, few students per class), the results may suffer from sample selection bias. We will continue to try to make each sample as close to a full sample as possible; that is, we should always collect data from all Econ 300/301/321 sections. With this effort, we hope future assessment results will be more reliable and comparable over time.
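
One simple first check, sketched below, is to treat each year's figure as an independent sample proportion and compare the observed drop to its standard error. The sketch is a rough illustration only, not part of the department's assessment procedure, and it assumes the reported percentages apply to the full assessed samples (84 students in 2016, 130 in 2017).

from math import sqrt

# Quantitative Reasoning: share of students scoring 2 or 3 on the rubric,
# and the total number of students assessed each fall (from question 10).
# Assumption: the reported percentages cover the full assessed samples.
p_2016, n_2016 = 0.85, 84
p_2017, n_2017 = 0.73, 130

# Standard error of the difference between two independent proportions.
se = sqrt(p_2016 * (1 - p_2016) / n_2016 + p_2017 * (1 - p_2017) / n_2017)
z = (p_2016 - p_2017) / se

print(f"drop = {p_2016 - p_2017:.2f}, SE = {se:.3f}, z = {z:.2f}")
# As a rough guide, |z| below about 2 is consistent with ordinary
# year-to-year sampling noise; larger values would point to a real change.

Running the same comparison on each rubric category, against several past years rather than a single one, would give a better sense of the normal range of variation before drawing conclusions.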


16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

N.A.

17) If the program did not engage in assessment activities, please justify.

N.A.