Unit: Economics
Program: Economics (BA)
Degree: Bachelor's
Date: Tue Oct 07, 2014 - 10:34:16 am

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

1. Economic literacy: Be able to clearly explain core economic terms, concepts and theories.

(1a. General education, 1b. Specialized study in an academic field, 2b. Conduct research, 2c. Communicate and report)

2. Critical thinking: Demonstrate the ability to apply economic reasoning to contemporary social issues and policy problems.

(1a. General education, 1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 2a. Think critically and creatively, 2b. Conduct research, 2c. Communicate and report, 3a. Continuous learning and personal growth, 3b. Respect for people and cultures, in particular Hawaiian culture, 3c. Stewardship of the natural environment, 3d. Civic participation)

3. Quantitative reasoning: Apply appropriate quantitative and statistical techniques to economic analysis. Conduct economic analysis using equations and graphs.

(1a. General education, 2a. Think critically and creatively, 2b. Conduct research)

4. Reporting: Develop expertise needed to effectively communicate results of economic research and analysis to colleagues and decision-makers through written reports and oral presentations.

(1a. General education, 2b. Conduct research, 2c. Communicate and report, 3d. Civic participation)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL:
Student Handbook. URL, if available online: http://www.economics.hawaii.edu/undergrad/UGguide.pdf
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: http://www.economics.hawaii.edu/courses.html
Other:
Other:

3) Select one option:

Curriculum Map File(s) from 2014:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program assessment activities between June 1, 2013 and September 30, 2014? (e.g., establishing/revising outcomes, aligning the curriculum to outcomes, collecting evidence, interpreting evidence, using results, revising the assessment plan, creating surveys or tests, etc.)

Yes
No (skip to question 14)

6) For the period between June 1, 2013 and September 30, 2014: State the assessment question(s) and/or assessment goals. Include the SLOs that were targeted, if applicable.

The objective of the assessment is to evaluate the current status of our undergraduate students with respect to the SLOs listed in question 1.

The department uses a grading rubric to determine how our students perform in the following areas, rated from low performance (1) to high performance (3).

1. Understand and apply economic concepts and theories in a clear and effective manner

1) Does not understand or apply economic concepts; is confused (ex: struggles with definitions of important terms and ideas used in the subject matter of the class)

2) Describes economic concepts, but does not clearly understand or apply them (ex: correctly defines important terms and ideas, but cannot correctly apply them to the situation at hand)

3) Understands and applies economic concepts and theories in a clear and effective manner (ex: correctly defines and applies important terms and ideas)

2. Think critically and solve problems

1) Does not identify the question at hand, and fails to think critically and solve problems (ex: does not know which concepts a question is asking about)

2) Identifies the question at hand, but fails to think critically and solve problems (ex: knows which concepts and ideas a question is addressing, but is unable to use these ideas to find a solution)

3) Identifies the question at hand, thinks critically, and solves problems in an illuminating way (ex: uses concepts and ideas correctly to discuss solutions to problems)

3. Demonstrate quantitative skills

1) Does not understand or apply quantitative skills to the topic/issue (ex: unable to correctly translate a word problem into a mathematical expression)

2) Uses quantitative skills relevant to the topic/issue but applies them incorrectly or in an incomplete manner (ex: sets up a problem correctly, but is unable to solve it correctly)

3) Uses quantitative skills to address the issue/topic at hand (ex: sets up and solves problems correctly)

4. Communicate findings both orally and in writing

1) Fails to communicate findings orally in a meaningful way and/or fails to communicate findings in writing (ex: writing is so unclear that it is impossible to tell whether the student understands core ideas and concepts)

2) Communicates findings orally, but fails to stimulate interest from the audience, and/or communicates findings in writing in an unclear manner (ex: student appears to understand key ideas and concepts but does not express these ideas clearly and succinctly)

3) Clearly communicates findings orally, stimulating interest and discussion from the audience, and communicates findings in writing in a clear and engaging manner (ex: student can readily relate key ideas and concepts from class in a way that leaves no doubt about the student's understanding of the material)

7) State the type(s) of evidence gathered to answer the assessment question and/or meet the assessment goals that were given in Question #6.

We implemented a course-embedded assessment program. In December 2013, we assessed student performance in two sections of Econ 300, two sections of Econ 301, and one section of Econ 321. While grading the final exams for their section, instructors used the department's scoring rubric, developed in 2005-2006, to score a randomly selected sample of at least 50 percent of the students in the section.

8) State how many persons submitted evidence that was evaluated. If applicable, please include the sampling technique used.

While grading the final exams for their section, instructors used the scoring rubric to score a randomly selected sample of at least 50 percent of the students in the section. A total of 91 students were assessed in Fall 2013: 32 from Econ 300, 34 from Econ 301, and 25 from Econ 321.

9) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

10) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

11) For the assessment question(s) and/or assessment goal(s) stated in Question #6:
Summarize the actual results.

The results of the December 2013 assessment showed that students performed well on the Economic Literacy, Critical Thinking, and Reporting SLOs, with over 90% of students receiving a 2 or 3 in these areas. However, only 78% received a 2 or 3 in Quantitative Reasoning. This indicates a relative weakness among our students in using equations, graphs, and descriptive statistics to perform economic analyses. The pattern is consistent with trends noted in previous years, with the notable exception of an improvement in Critical Thinking scores this year: last year, approximately 80% of students scored a 2 or 3 on that measure.

One explanation for the difference from last year may lie in the smaller sample size of the assessment: approximately 20 fewer students were assessed this year than last year because of smaller enrollments across the intermediate-level classes. Smaller class sizes may have given professors the opportunity to spend more time sharpening critical reasoning skills among students without a decline in other areas.
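
For illustration only, the following is a minimal sketch of how the shares of students scoring a 2 or 3 in each rubric area might be tabulated from instructor score sheets. The file name, column names, and data layout are hypothetical and do not describe the department's actual procedure.

    # Hypothetical tabulation of rubric scores (1-3) by SLO area.
    # The CSV file and column names below are assumptions for illustration only.
    import pandas as pd

    scores = pd.read_csv("fall2013_rubric_scores.csv")  # one row per assessed student
    slo_columns = ["economic_literacy", "critical_thinking",
                   "quantitative_reasoning", "reporting"]

    for slo in slo_columns:
        share_2_or_3 = (scores[slo] >= 2).mean() * 100
        print(f"{slo}: {share_2_or_3:.0f}% scored a 2 or 3 "
              f"(n = {int(scores[slo].count())})")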

12) State how the program used the results or plans to use the results. Please be specific.

In collaboration with the Mathematics Department, we helped develop the course Math 161: Pre-calculus and Elements of Calculus for Economics and Social Sciences. The course is particularly beneficial for students in economics and the social sciences because it draws on examples frequently used in economics and related fields. Beginning in Fall 2011, the course has been offered every semester, with about 30-60 students registering each time. Efforts to cross-list the course with the economics department are ongoing.

Given the wide variation in quantitative preparation among students, we have also offered a course specifically focused on quantitative skills: Econ 420: Mathematical Economics, which has been offered four times since Fall 2010. The class seeks to strengthen quantitative reasoning skills among more senior students.

To assist with developing quantitative reasoning skills among younger students, we have participated in the Chancellor's TA Initiative. Beginning in Fall 2012, the economics department received one additional TA per year. This additional TA currently supports a lecture/discussion format in one Econ 130: Principles of Microeconomics class every semester. The graduate student teaching assistants teach several small (10-25 student) discussion sections each week that supplement the regular lectures, allowing students to participate in a smaller classroom setting than Econ 130 traditionally offers. From the first two years of this class, we have found evidence of improved performance among students in the middle of the grade distribution. Students who regularly attended the discussion sections appeared to benefit from the increased instructor attention, the opportunity to have questions answered, and participation in group work. However, it is not clear that the structure reduced DFIW rates, as the number of students benefiting from this class structure remains small. In coming years, we hope to see that this additional investment in helping students acquire foundational skills in economics will improve their quantitative reasoning skills as assessed at the intermediate level and improve the measured student success rate. We will continue to experiment with the format of TA-led discussion sections to determine how best to achieve this goal.

The department currently administers an exit survey to students just before graduation. The survey was designed to obtain more information about student employment outcomes, helping the department keep in touch with graduates and promote the program to undeclared majors by citing examples of successful graduates. The survey also asks a few additional questions about most and least favorite classes, semesters to graduation, and transfer status. The department is considering modifying the survey to obtain additional indirect evidence on student learning outcomes at the time of graduation, not just in the intermediate classes, but this work remains preliminary and no data are available to report at this time.

The department has also considered disaggregating student information by using student ID numbers from class assessments to link to the Banner system and obtain student-level data. The goal of this exercise would be to see whether certain identifiable groups of students are having more or less success in meeting learning objectives. For instance, if the weakness in quantitative reasoning skills is driven primarily by transfer students, then offering the Math 161 class to freshmen at Manoa is unlikely to be a good solution to the problem. At this time, the department decided not to undertake this task because of limited office support for performing the data matching, but it remains an interesting option for future assessments.
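
Should the department revisit this option, the matching step itself would be straightforward. The sketch below is illustrative only: the file names, column names, and the assumption that a de-identified extract of Banner records could be obtained are all hypothetical.

    # Hypothetical linkage of rubric scores to student-level Banner records.
    # File names, column names, and the extract format are assumptions.
    import pandas as pd

    rubric = pd.read_csv("rubric_scores.csv")    # assumed to include a student_id column
    banner = pd.read_csv("banner_extract.csv")   # e.g., student_id, transfer_status

    merged = rubric.merge(banner, on="student_id", how="left")

    # Compare quantitative reasoning outcomes for transfer vs. non-transfer students.
    summary = (merged.groupby("transfer_status")["quantitative_reasoning"]
                     .apply(lambda s: (s >= 2).mean() * 100))
    print(summary.round(1))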

13) Beyond the results, were there additional conclusions or discoveries?
This can include insights about assessment procedures, teaching and learning, program aspects and so on.

N.A.

14) If the program did not engage in assessment activities, please explain.
Or, if the program did engage in assessment activities, please add any other important information here.

N.A.