Program: Economics (BA)
Degree: Bachelor's
Date: Mon Sep 07, 2015 - 1:17:50 pm
1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)
1. Economic literacy: Be able to clearly explain core economic terms, concepts and theories.
(1a. General education, 1b. Specialized study in an academic field, 2b. Conduct research, 2c. Communicate and report)
2. Critical thinking: Demonstrate the ability to apply economic reasoning to contemporary social issues and policy problems.
(1a. General education, 1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report, 3a. Continuous learning and personal growth, 3b. Respect for people and cultures, in particular Hawaiian culture, 3c. Stewardship of the natural environment, 3d. Civic participation)
3. Quantitative reasoning: Apply appropriate quantitative and statistical techniques to economic analysis. Conduct economic analysis using equations and graphs.
(1a. General education, 2a. Think critically and creatively, 2b. Conduct research)
4. Reporting: Develop expertise needed to effectively communicate results of economic research and analysis to colleagues and decision-makers through written reports and oral presentations.
(1a. General education, 2b. Conduct research, 2c. Communicate and report, 3d. Civic participation)
2) Your program's SLOs are published as follows. Please update as needed.
Student Handbook. URL, if available online: http://www.economics.hawaii.edu/undergrad/UG_guide_2014.pdf
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: http://www.economics.hawaii.edu/courses.html
Other: On the department's intranet site accessible to our faculty only
Other:
3) Please review, add, replace, or delete the existing curriculum map.
- File (03/16/2020)
4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.
1-50%
51-80%
81-99%
100%
5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?
Yes
6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, and the effect of prerequisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:
7) Briefly explain the assessment activities that took place in the last 18 months.
We implemented a course-embedded assessment program in our core department courses. In December 2014, we assessed student performance in two sections of “Econ 300: Intermediate Macroeconomics”, one section of “Econ 301: Intermediate Microeconomics”, and two sections of “Econ 321: Introduction to Statistics”. While grading the final exams for their sections, instructors scored a random sample of at least 50 percent of their students using the department’s scoring rubric, which was developed in 2005-2006.
8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)
Direct evidence of student learning (student work products)
Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:
Indirect evidence of student learning
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:
Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)
Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:
9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.
While grading the final exams for their sections, instructors scored a random sample of at least 50 percent of their students using the department’s scoring rubric. A total of 92 students were assessed in Fall 2014: 30 from Econ 300, 11 from Econ 301, and 51 from Econ 321.
10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:
11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:
12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.
The results of the December 2014 assessment deviated from earlier trends. Quantitative Reasoning improved markedly: 91% of students received a 2 or 3, compared with 78% in 2013 and 75% in 2012. Historically, students performed well on the Economic Literacy, Critical Thinking, and Reporting SLOs but were weaker in Quantitative Reasoning. In December 2014, however, only around 80% of students scored a 2 or 3 on these three SLOs, down from over 90% on two of them (Economic Literacy and Reporting) in both December 2013 and December 2012.
13) What best describes how the program used the results? (Check all that apply.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:
14) Please briefly describe how the program used the results.
To help establish the department's SLOs among lower-division students, we have participated in the Chancellor's TA Initiative. Beginning in Fall 2012, the economics department received one additional TA per year for two years. The TA was used to offer a lecture/discussion format in one “Econ 130: Principles of Microeconomics” class every semester, and we have continued this practice even after the additional TA position was no longer available. The graduate student teaching assistants teach several small (10-25 student) discussion sections each week that supplement the regular lectures, allowing students to participate in a smaller classroom setting than Econ 130 traditionally offers. We have found evidence of improved performance among students in the middle of the grade distribution: students who regularly attended the discussion sections appeared to benefit from the increased instructor attention, the opportunity to have questions answered, and participation in group work.
We have also begun offering a course specifically focused on quantitative skills on a regular basis: “Econ 420: Mathematical Economics”, taught every Fall semester. This class seeks to strengthen quantitative reasoning skills among upper-division students.
In collaboration with the Mathematics Department, we helped develop the course “Math 161: Pre-calculus and Elements of Calculus for Economics and Social Sciences”. The course is particularly beneficial for students in economics and the social sciences because it draws on examples frequently used in those fields.
The department administers an exit survey to students just before graduation. The survey was designed to gather information about student employment outcomes, to help the department keep in touch with graduates, and to promote the program to undeclared majors by citing examples of successful graduates. It currently asks a few additional questions about most and least favorite classes, semesters to graduation, and transfer status. The department is considering modifying the survey to obtain additional indirect evidence of student learning outcome achievement at the time of graduation, not just in the intermediate classes, but this work remains preliminary and no data are available to report at this time.
15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.
N.A.
16) If the program did not engage in assessment activities, please explain.
N.A.