Data Collection

Collecting Evidence of Student Learning

This workshop will introduce you to common evidence collection methods used in program assessment: exams, portfolios, surveys, interviews, …

A time-effective and fair way to collect assessment data

Assessment efforts at the University of Hawaiʻi at Mānoa strive to help departments meet the learning objectives the departments have set for themselves. As part of these efforts, each department has designed Program Learning Outcomes (PLOs) for the department as a whole, and individual faculty members have established Student Learning Outcomes (SLOs) for the classes they teach; the SLOs are coordinated with the PLOs. Collecting data to assess how well the SLOs and PLOs are met has proven problematic, though, in large part because the data collection effort is widely viewed as an additional workload for faculty, who are already stretched thin with research, teaching, administrative, and community service responsibilities. Unsurprisingly, many faculty are inclined to collect and provide only small amounts of data. If the data are too meager, however, then an assessment that is both useful and fair is not possible. To address this, the Department of Geology and Geophysics is asking individual faculty members to provide a subset of the data they regularly collect anyway as a normal part of their grading procedures; in this sense, no new data are required. In one-on-one meetings with the department’s assessment coordinator, instructors identify a suite of student responses (e.g., particular exam questions, laboratory assignments, or parts of writing assignments) that is appropriate and sufficiently comprehensive to assess how well the course is meeting its SLOs. The initial feedback has been that this approach is reasonable in terms of the time required and is perceived as fair.
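
The approach described above amounts to tagging items an instructor already grades with the SLOs they address and rolling those scores up by outcome. As a minimal sketch of what such a mapping could look like, assuming invented item names, SLO labels, and point values (nothing here reflects the department’s actual items or tooling):

```python
# Minimal sketch: map already-graded items to SLOs and compute per-SLO attainment.
# Item names, SLO labels, and point values are hypothetical examples.
from collections import defaultdict

ITEM_TO_SLOS = {
    "midterm_q3": ["SLO1"],
    "midterm_q7": ["SLO2"],
    "lab_assignment_4": ["SLO2", "SLO3"],
    "final_essay_part2": ["SLO3"],
}
MAX_POINTS = {"midterm_q3": 10, "midterm_q7": 10, "lab_assignment_4": 20, "final_essay_part2": 25}

def slo_attainment(student_scores):
    """Average fraction of available points earned on the items mapped to each SLO."""
    totals = defaultdict(float)  # summed score fractions per SLO
    counts = defaultdict(int)    # number of (student, item) observations per SLO
    for record in student_scores:
        for item, score in record.items():
            for slo in ITEM_TO_SLOS.get(item, []):
                totals[slo] += score / MAX_POINTS[item]
                counts[slo] += 1
    return {slo: totals[slo] / counts[slo] for slo in totals}

# Two fabricated student records:
scores = [
    {"midterm_q3": 8, "midterm_q7": 6, "lab_assignment_4": 17, "final_essay_part2": 20},
    {"midterm_q3": 9, "midterm_q7": 9, "lab_assignment_4": 14, "final_essay_part2": 22},
]
print(slo_attainment(scores))  # e.g. {'SLO1': 0.85, 'SLO2': ..., 'SLO3': ...}
```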

Measuring Our Success

College of Education (COE) programs select six to eight key program assessments to systematically measure, collect, and analyze data on student learning. Faculty members enter assessment data each semester into our in-house COE Student Information System (SIS). The SIS allows us to summarize and compare data within and across programs at all levels. The data allow us to test our beliefs about our programs and make adjustments based on actual candidate performance and stakeholder feedback. During the 2012-13 academic year, using the data entered into the SIS, we began creating visuals of aggregate student performance on key assessments and posting these data on our college website under the header “Measuring Our Success.” The use of data visuals not only provides a more engaging means for faculty to analyze student performance but also allows us to share this information with multiple audiences. Through meetings with multiple stakeholders, we have learned how important it is to make data available to the public as well as to our own educational community. Examples of activities and changes based on student learning data that have led to continuous improvement of candidate performance in COE programs will also be provided.
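
The SIS itself is an in-house system, so its schema is not shown here; as a rough illustration only, the sketch below assumes hypothetical record fields and a 1-4 rubric scale to show how key-assessment ratings might be grouped by program, semester, and assessment before being turned into the kind of visuals described above:

```python
# Rough illustration: aggregate key-assessment ratings by (program, semester, assessment).
# Field names and the rating scale are assumptions, not the actual COE SIS schema.
from collections import defaultdict
from statistics import mean

records = [
    {"program": "Elementary Ed", "semester": "Fall 2012", "assessment": "Clinical Evaluation", "rating": 3},
    {"program": "Elementary Ed", "semester": "Fall 2012", "assessment": "Clinical Evaluation", "rating": 4},
    {"program": "Secondary Ed", "semester": "Fall 2012", "assessment": "Clinical Evaluation", "rating": 2},
]

def summarize(records):
    """Mean rating and count for each (program, semester, assessment) group."""
    groups = defaultdict(list)
    for r in records:
        groups[(r["program"], r["semester"], r["assessment"])].append(r["rating"])
    return {key: {"n": len(ratings), "mean": round(mean(ratings), 2)}
            for key, ratings in groups.items()}

for key, stats in summarize(records).items():
    print(key, stats)
```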

An Overview of Assessment in the Mathematics Department

This poster reviews the Mathematics Department’s assessment activities. It describes the program goals, assessment plan, and current activities; presents a curriculum map; and includes samples of syllabi that incorporate the program goals. It also offers thoughts on systematic data collection and lessons learned.

First-Year Success: Evaluating a Peer-Led Learning Community Program

First-Year Programs (FYP) utilizes multiple approaches to assess student learning outcomes and program success. Institutional data are used to measure retention rates for students participating in Access to College Excellence (ACE) learning communities; National Student Clearinghouse (NSCH) data are gathered to measure student transfer rates. Qualitative data and feedback are collected through focus groups and surveys. In addition, ACE students complete two surveys measuring student expectations, engagement, and institutional commitment. Fall 2007 student engagement survey results indicate that ACE students felt significantly more informed about core graduation requirements, major requirements, and registration procedures. Students also felt significantly more connected to the university community. NSCH data indicate that a large proportion of ACE students who did not continue at UHM transferred to other institutions after their first year. FYP will expand its evaluation by collecting Drop-Failure-Withdraw (DFW) rates for classes offered as part of an ACE learning community.
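
For readers unfamiliar with the metrics named above, the sketch below shows, with entirely fabricated records, how a first-to-second-year retention rate and a DFW rate are typically computed; FYP’s actual figures come from institutional and National Student Clearinghouse data, not from code like this:

```python
# Fabricated example: computing a retention rate and a DFW rate.
# All names and values are invented for illustration.

def retention_rate(cohort):
    """Share of an entering cohort still enrolled at the institution the next fall."""
    returned = sum(1 for student in cohort if student["enrolled_next_fall"])
    return returned / len(cohort)

def dfw_rate(grades):
    """Share of course grades that are D, F, or W (withdrawal)."""
    dfw = sum(1 for grade in grades if grade in {"D", "F", "W"})
    return dfw / len(grades)

ace_cohort = [
    {"id": 1, "enrolled_next_fall": True},
    {"id": 2, "enrolled_next_fall": True},
    {"id": 3, "enrolled_next_fall": False},
]
section_grades = ["A", "B", "C", "D", "W", "B", "A", "F"]

print(f"Retention: {retention_rate(ace_cohort):.0%}")       # 67%
print(f"Section DFW rate: {dfw_rate(section_grades):.1%}")  # 37.5%
```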

Graduate Program Learning Assessment: Processes, Tools, and Resources

This workshop is co-sponsored by the Assessment and Curriculum Support Center, Graduate Division, and the Graduate Council. Wondering …

Choose a Method to Collect Data/Evidence

Last Updated: 4 March 2024. The data or evidence in a …

Using Portfolios in Program Assessment

On this page: 1. What is a portfolio? A portfolio is a systematic collection of …