Unit: Architecture
Program: Environmental Design (BEnvD)
Degree: Bachelor's
Date: Thu Nov 19, 2020 - 12:06:29 pm

1) Program Student Learning Outcomes (SLOs) and Institutional Learning Objectives (ILOs)

1. Design Skills and Methods: Understand the variety of design methods and demonstrate the ability to apply them to analyze contexts, formulate concepts, evolve multiple solutions, and critically judge final designs incorporating cultural, technological, aesthetic, and ethical concerns.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 3a. Continuous learning and personal growth, 3d. Civic participation)

2. Design Communication: Ability to use a variety of analog, digital, verbal, and written means to conceptualize, represent, and clearly communicate critical and complex design proposals.

(1b. Specialized study in an academic field, 2a. Think critically and creatively, 2c. Communicate and report)

3. Design Technology: Understand materials, methods, and technological systems in environmental design communication and the construction of built environments, and be able to critically evaluate and apply them in final design solutions.

(1b. Specialized study in an academic field)

4. Sustainability in Environmental Design: Understand and design projects that optimize, conserve, or reuse natural and built resources to provide healthful environments for users and reduce the negative environmental impacts of building construction and operations on future generations.

(1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 2b. Conduct research, 3b. Respect for people and cultures, in particular Hawaiian culture, 3c. Stewardship of the natural environment)

5. Interdisciplinary Problem Solving and Design Research: Understand and engage in collaborative, interdisciplinary, team-based research using appropriate methodologies in order to deepen understanding and derive holistic, responsible environmental design solutions that connect to diverse technological, social, cultural, and environmental concerns.

(1a. General education, 1b. Specialized study in an academic field, 2a. Think critically and creatively, 2b. Conduct research, 3a. Continuous learning and personal growth, 3b. Respect for people and cultures, in particular Hawaiian culture, 3d. Civic participation)

6. History and Theory in Environmental Design: Understand the historical and theoretical forces that impact current design thinking and provide critical insight into the shaping of cultural and social relationships, values, and decisions about the built and natural environment.

(1a. General education, 1b. Specialized study in an academic field, 1c. Understand Hawaiian culture and history, 2c. Communicate and report, 3b. Respect for people and cultures, in particular Hawaiian culture)

7. Professional Practice: Understand the roles, methods, collaborative processes, and ethical considerations of the environmental design professions and their impact on local and global environmental contexts.

(1b. Specialized study in an academic field, 2b. Conduct research, 3a. Continuous learning and personal growth, 3d. Civic participation)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL:
Student Handbook. URL, if available online:
Information Sheet, Flyer, or Brochure URL, if available online:
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: SLOs relevant to each course are listed on the course syllabus
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2020:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Does the program have learning achievement results for its program SLOs? (Example of achievement results: "80% of students met expectations on SLO 1.") (Check one):

No
Yes, on some (1-50%) of the program SLOs
Yes, on most (51-99%) of the program SLOs
Yes, on all (100%) of the program SLOs

6) Did your program engage in any program learning assessment activities between November 1, 2018 and October 31, 2020?

Yes
No (skip to question 17)

7) What best describes the program-level learning assessment activities that took place for the period November 1, 2018 and October 31, 2020? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate other pressing issues related to student learning achievement for the program (explain in question 8)
Other:

8) Briefly explain the assessment activities that took place since November 2018.

At the end of each semester, faculty gather to discuss student work from required and elective studios. We judge the work according to the National Architectural Accrediting Board's (NAAB) "high pass" and "low pass" model. Each instructor chooses student work from the two categories for each studio and presents the work to the entire faculty group. In the lower-division studios, faculty visit each studio to view students' displayed work as a whole. We discuss the relative outcomes of different project assignments and evaluate whether most students are meeting the SLO and NAAB Student Performance Criteria (SPC) for each studio. We use these qualitative discussions to determine whether to institute curricular changes. In 2020, we established a new set of SLOs for the program in order to measure student success according to Assessment Center models.

Since establishing the new SLOs, we have held one end-of-semester faculty review (Spring 2020); however, due to the pandemic, it was held online, which limited the full-day, faculty-wide assessment to a less rigorous level than is typical of our program.

In 2018-2019, we altered the sequencing and content of the first- and second-year studios in response to these evaluations by introducing computer applications coursework (a new ARCH 102) earlier in the sequence. Our evaluation of Spring 2020 shows that this has been a successful change thus far.

 

9) What types of evidence did the program use as part of the assessment activities checked in question 7? (Check all that apply.)

Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Assessment-related documents (e.g., assessment plan, SLOs, curriculum map)
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1: Display of student work in individual studios.
Other 2: Faculty end-of-semester "Review Day" to discuss student achievement in each required and elective studio.

10) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

The number of students whose work is reviewed varies from semester to semester, depending on the format of the Faculty Review Day and on how each individual studio instructor chooses to present their students' work. Typically, we judge four projects from each design studio (two high pass and two low pass) in presentation format, and we also view the work of entire studios for ARCH 101, 102, 201, and 202. These reviews involve assessing the work of approximately 150 students per semester.

11) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other: Entire SoA faculty, including Lecturers.

12) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

13) Summarize the results from the evaluation, analysis, interpretation of evidence (checked in question 12). For example, report the percentage of students who achieved each SLO.

Over the course of the assessment period, we have not assessed student work according to rubrics or other quantitative measures.

Instead, faculty have viewed the work from each studio as a qualitative whole, in order to identify areas in which a particular studio (and its associated syllabus content) has achieved its stated NAAB and SLO criteria, and to determine whether the work evidences increasing student aptitude in design as students move progressively through the studio sequence.

Please see our discussion in question 16 regarding our proposed development of rubrics for future assessment activities.

14) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

15) Please briefly describe how the program used its findings/results.

In 2019, after reviewing the work from our course sequence of ARCH 101, ARCH 132, ARCH 235, and ARCH 201 during our Faculty Review Days, we recognized that students needed the content from ARCH 235 earlier in their educational path. We therefore altered our program chart to introduce the material from ARCH 235 Computer Applications in Design (formerly the Fall semester studio of the second year) earlier in the program. This course became ARCH 102 and is now the Spring studio of the first year rather than the Fall studio of the second year. The material from ARCH 132 was revised to become the Fall studio of the second year. We believe that this re-sequencing, together with retitling the courses as ARCH 101 Design Fundamentals I, ARCH 102 Design Fundamentals II, ARCH 201 Beginning Design I, and ARCH 202 Beginning Design II, offers a clearer path for students to build their design skill sets.

Please review the changes in the curriculum maps uploaded with this report. The earlier curriculum map (2019) was replaced by the new sequence, labeled 2020, which reflects the curriculum changes made as a result of the Faculty Review Day discussions in 2018-19.

 

16) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

The Dean and program director have reviewed the helpful presentation from the Mānoa Assessment and Curricular Support Center, prepared as part of the external review of the BEnvD coordinated through the OVCAA, regarding the creation of rubrics from our SLOs with which to measure student performance in more precise percentages. The recommendations are very useful, and we wish to thank the Assessment Office for its work. The presentation has been distributed to faculty for their consideration and input.

At our Faculty Review Day this coming December 2020, the program director will allot time during the review presentations to discuss how we might best create rubrics for judging our students' work beyond the "high pass" and "low pass" models we use for overall curricular improvement. Because of the qualitative nature of design work, we will need to discuss how best to implement new rubrics and the ways in which doing so might help us judge whether a given percentage of students is meeting the SLO criteria for each course.

Students often work across multiple SLOs on any given design assignment, and their grades are based on multiple projects. The qualitative nature of design work makes it difficult to determine rubrics for whether a simple percentage of students meets all (or some) of the SLOs for a course. Traditional rubrics for measuring student achievement are less straightforward in the BEnvD (especially in terms of specific grades for a class), since a student may meet some SLOs in some projects across the semester but have suboptimal results in others. For example, a student's half-semester project might earn high marks on SLOs 1 and 2 but fail to meet expectations on SLOs 3 and 4; subsequently, their final project might meet SLOs 3 and 4 but lack achievement on SLOs 1 and 2. Such a student would have to be counted both positively and negatively in each of these SLO achievement categories, and one count might cancel out the other. The qualitative nature of design project assignments thus substantially complicates traditional percentage-based assessment rubrics, which are premised on a student's demonstration of quantifiable knowledge measured by papers and exams.

Taking these issues into account, and following the Fall 2020 Faculty Review Day discussion about assessment, we propose to create a subcommittee, commencing in Spring 2021, to develop rubrics that can address the complexity of measuring how SLO criteria are met in BEnvD student design coursework.

For non-studio classes, we expect the development of rubrics to be more straightforward. The faculty subcommittee will address these and all other courses in consultation with instructors. The program director and subcommittee plan to meet with the Assessment Center to discuss our proposed strategies for measuring students' performance, with the goal of creating a more mathematical model of student outcomes across all SLOs.
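To make the shape of such a model concrete, the short Python sketch below shows one hypothetical way to tally per-SLO achievement percentages from per-project "met / not met" judgments. The student records, project names, and both aggregation rules are invented for discussion only and are not a proposed program policy; the sketch simply demonstrates how the choice of aggregation rule determines whether the "canceling out" problem described above appears in the reported percentages.

    # Hypothetical sketch only: tallying per-SLO achievement from per-project
    # rubric judgments. Students, projects, and rules below are invented examples.

    # Each record maps a project to the set of SLO numbers judged "met" on it.
    students = {
        "Student A": {"half-semester": {1, 2}, "final": {3, 4}},  # the case described above
        "Student B": {"half-semester": {1, 3}, "final": {1, 2, 3, 4}},
        "Student C": {"half-semester": set(), "final": {2}},
    }

    ALL_SLOS = [1, 2, 3, 4]

    def percent_meeting(records, slo, rule):
        """Percentage of students who meet `slo` under a given aggregation rule."""
        met = sum(1 for projects in records.values() if rule(projects, slo))
        return 100 * met / len(records)

    # Rule 1: the SLO counts as met if demonstrated on ANY project in the semester.
    def met_on_any(projects, slo):
        return any(slo in met_set for met_set in projects.values())

    # Rule 2: the SLO counts as met only if demonstrated on EVERY project.
    def met_on_every(projects, slo):
        return all(slo in met_set for met_set in projects.values())

    for slo in ALL_SLOS:
        print(f"SLO {slo}: {percent_meeting(students, slo, met_on_any):.0f}% (any project), "
              f"{percent_meeting(students, slo, met_on_every):.0f}% (every project)")

Under the "any project" rule, Student A counts as meeting all four SLOs; under the "every project" rule, the same student counts as meeting none of them, which is exactly the cancellation concern raised above. Agreeing on an aggregation rule would therefore be an early task for the subcommittee.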

17) If the program did not engage in assessment activities, please justify.