Unit: Economics
Program: Economics (PhD)
Degree: Doctorate
Date: Wed Sep 02, 2015 - 11:13:47 am

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

The goal of the Ph.D. program is to train professional economists for careers in teaching, research and policy analysis. 

1. Students will demonstrate an understanding of economic theory and analytical and quantitative tools.

2. Students will demonstrate an ability to understand, integrate, and apply the various tools, concepts, and principles of economics and quantitative methods to analyze and to develop solutions to economic problems in a clear and concise written form.

3. Students will demonstrate "frontier"-level competency in, and familiarity with, the literature of the student's chosen specialty area.

4. Students will demonstrate the ability to conduct independent and original research in economics.

5. Students will have the skills necessary to qualify for teaching positions at the university and college levels, and for research positions in the public or private sector.

6. Program graduates will be able to obtain employment that uses the level of expertise obtained in the Ph.D. program.

7. Students will complete these goals according to the timeline described in the graduate program guidelines.


2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.economics.hawaii.edu/grad/index.html
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online:
Other: On the Department's intranet site (accessible to faculty only).
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2015:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

Yes
No (skip to question 16)

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

In September 2014, the faculty approved a new Program Assessment Form to be completed at the conclusion of every doctoral thesis proposal defense and final thesis defense. The goal was to assess how well the program was meeting SLOs 1-4. In the ensuing months, the assessment coordinator had to remind faculty several times to use the form and devised a new strategy to ensure that colleagues completed it. By September 2015, thesis committees had completed 9 forms.

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1: thesis proposal defense and final defense
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1:
Other 2:

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

Evidence was collected from 9 oral defenses (thesis proposal defenses and final thesis defenses).

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: All faculty serving on the 9 thesis committees

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

Getting faculty to use a new form is not easy, especially when many sit on thesis defenses only once or twice a year and can easily forget the new procedure. The assessment coordinator found that the most effective means of ensuring compliance was to physically staple the new form to the UH Doctorate Forms II and III available in the department office, and to encourage faculty and graduate students to pick up these forms from the office rather than downloading and printing them.

Regarding SLOs 1-4, which the new form targets, the program was rated as meeting or exceeding expectations on all four SLOs in 8 of the 9 evaluations; in the remaining case, a proposal defense, the program was rated as approaching expectations on SLOs 1 and 2. Unsurprisingly, ratings tended to be higher for final thesis defenses than for proposal defenses: in 3 final defenses, the program was rated as exceeding expectations on all four SLOs.

13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

The assessment coordinator plans to remind faculty regularly, at every thesis defense, to use the new forms. We plan to discuss the results of the 9 forms collected so far at our next faculty meeting.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

No

16) If the program did not engage in assessment activities, please explain.

NA