Unit: Second Language Studies
Program: Second Language Studies (MA)
Degree: Master's
Date: Mon Sep 28, 2015 - 2:52:51 pm

1) Below are your program's student learning outcomes (SLOs). Please update as needed.

MA in SLS

All students graduating with the MA in SLS will achieve the following learning outcomes:

1. Knowledge Base of Second Language Studies.

Our graduates will develop familiarity with topics and concepts fundamental to the broad knowledge base of the field of Second Language Studies, including: (a) the scope of issues and methods in applied linguistics, (b) linguistic analysis, (c) second language acquisition, and (d) sociolinguistics. They will also understand how their own interests in SLS relate to the larger academic, educational, and sociopolitical contexts of the discipline.

2. Utilization of Research.

Our graduates will be able to access, understand, and critically evaluate the current SLS research literature and engage in systematic investigation of topics and concepts in the SLS knowledge base to inform their own and others' professional practices.

3. Professionalism.

Our graduates will acquire the disposition to continue professional development for the duration of their careers, seeking increased knowledge of themselves and the discipline while remaining flexible and open to change. To do so, they will acquire the skills to communicate and interact effectively with their colleagues, in order to promote effective and ethical professional environments. In addition, our graduates will be able to communicate skillfully about their SLS work, both orally (e.g., at work or professional meetings) and in writing (e.g., through in-house reports and/or articles in professional newsletters and journals).

For students pursuing one of the five MA in SLS specializations, additional learning outcomes are associated with each. [Available upon request, but not included here due to length]

The program is designed to meet the needs of students who wish to anchor their professional practice in language-related lines of work in relevant theory and research, as well as students who are primarily interested in becoming SLS researchers. In order to meet these diverse needs, the program offers a small set of foundational core courses and a wide range of electives, some of which are required in one or more of the five program specializations. SLS courses address sociocultural, psychological, and linguistic aspects of language use, learning, and education, as well as applied linguistic research methodology. By emphasizing the interdependence of theory, research, and professional practice, we cultivate in our students the intellectual basis for an understanding of principles that will help guide them in their future careers. Graduates of the MA program assume key positions in a number of areas of applied linguistics, including teaching (both public and private sectors in the United States and abroad), teacher education, administration, research, evaluation, materials writing, and language-related professions outside of language education. A substantial number of students continue their graduate education in doctoral (usually PhD) programs.

 

1) Institutional Learning Objectives (ILOs) and Program Student Learning Outcomes (SLOs)

2) Your program's SLOs are published as follows. Please update as needed.

Department Website URL: http://www.hawaii.edu/sls/graduate/ma/ma-program/
Student Handbook. URL, if available online: under revision, available online soon
Information Sheet, Flyer, or Brochure URL, if available online: NA
UHM Catalog. Page Number:
Course Syllabi. URL, if available online: NA
Other:
Other:

3) Please review, add, replace, or delete the existing curriculum map.

Curriculum Map File(s) from 2015:

4) For your program, the percentage of courses that have course SLOs explicitly stated on the syllabus, a website, or other publicly available document is as follows. Please update as needed.

0%
1-50%
51-80%
81-99%
100%

5) Did your program engage in any program learning assessment activities between June 1, 2014 and September 30, 2015?

Yes
No (skip to question 16)

6) What best describes the program-level learning assessment activities that took place for the period June 1, 2014 to September 30, 2015? (Check all that apply.)

Create/modify/discuss program learning assessment procedures (e.g., SLOs, curriculum map, mechanism to collect student work, rubric, survey)
Collect/evaluate student work/performance to determine SLO achievement
Collect/analyze student self-reports of SLO achievement via surveys, interviews, or focus groups
Use assessment results to make programmatic decisions (e.g., change course content or pedagogy, design new course, hiring)
Investigate curriculum coherence. This includes investigating how well courses address the SLOs, course sequencing and adequacy, and the effect of pre-requisites on learning achievement.
Investigate other pressing issues related to student learning achievement for the program (explain in question 7)
Other:

7) Briefly explain the assessment activities that took place in the last 18 months.

All three activities listed under the first response option in Q6. We continued to monitor our MA students' progress, achievement of SLOs and PLOs, satisfaction with the program, and the successful placement of alumni. However, unchanged from 2014, resource shortages have prevented more formal approaches: our professorial FTE is 20% below normal, there are major staffing shortfalls in the undergraduate and language teaching programs of this department which have not been addressed by the administration despite repeated requests, and GA and lecturer support have been trimmed. Accordingly, we have had to narrow our assessment efforts, focusing primarily on the BA program (see BA program report). The College of Languages, Linguistics & Literatures is struggling with a $1.6 million deficit and consequently lacks the means to support resource-demanding departmental assessment efforts. In broader perspective, the capability of UH Manoa to provide a full-service university is questionable in the face of an unwillingness on the part of the people's representatives in the State Legislature to fund the university, its faculty and staff, and to support its students in a manner most likely to provide quality higher education.

We monitored students' progress while in the program using course evaluation procedures based on quizzes, examinations, class presentations, and final projects.

The MA program naturally uses a range of assessment techniques. Major assessments related to student learning outcomes (and the educational factors that contribute to them) include:

1. Scholarly Paper / Thesis: For the MA program there is a scholarly work requirement which is intended to reflect each student's ability to: (a) engage in thoroughgoing research that is relevant to the field of SLS; (b) persist in long-term scholarly projects, from inception to dissemination; and (c) produce high-quality publishable writing. These assessments offer valuable insights into the extent to which students have achieved primary learning outcomes, such as familiarity with the broad content that describes particular domains of SLS, understanding and competent implementation of relevant research methods, and professional-level abilities to communicate about their work. Both of the assessment formats involve multiple stages of (a) research conceptualization (formalized for the thesis in a proposal and its defense), (b) research, (c) writing, (d) feedback, and (e) completion of a final version (with formal public review in the case of the thesis, and internal two-reader review in the case of the scholarly paper).

2. Graduating student survey: The College of LLL exit survey has subsumed our previous internal survey of graduating students. Within the redesign, questions have been generated specifically for the MA program; these questions target both the levels of learning in key outcome areas and the perceived professional value of these outcomes. In addition to these department-internal questions, SLS stakeholders have advised the College of LLL on the design of general questions asked of all graduating students in the college. The Department benefited from another administration of this survey during the past year.

3. Professional activities review: We normally gather data on current and former students' publication and presentation activities over the preceding year. These professional activities provide one key indicator of the extent to which our students in the three graduate programs are developing and maintaining professional profiles that are recognized as valuable by the second language studies disciplines. Because of the lack of staff and professorial faculty over the past two academic years, we have not analyzed these data or been able to present them systematically.

4. Alumni survey and review: The new Departmental website has begun to show alumni places/institutions of employment. However, systematic collection and analysis of pertinent data is beyond Departmental capacity under current funding conditions. The alumni data supplied by the UH Foundation are not fully current or correct (despite the fact that this is a UHF responsibility), so it will take some time before this can be brought back to an adequate degree of usability.

8) What types of evidence did the program use as part of the assessment activities checked in question 6? (Check all that apply.)

Direct evidence of student learning (student work products)


Artistic exhibition/performance
Assignment/exam/paper completed as part of regular coursework and used for program-level assessment
Capstone work product (e.g., written project or non-thesis paper)
Exam created by an external organization (e.g., professional association for licensure)
Exit exam created by the program
IRB approval of research
Oral performance (oral defense, oral presentation, conference presentation)
Portfolio of student work
Publication or grant proposal
Qualifying exam or comprehensive exam for program-level assessment in addition to individual student evaluation (graduate level only)
Supervisor or employer evaluation of student performance outside the classroom (internship, clinical, practicum)
Thesis or dissertation used for program-level assessment in addition to individual student evaluation
Other 1:
Other 2:

Indirect evidence of student learning


Alumni survey that contains self-reports of SLO achievement
Employer meetings/discussions/survey/interview of student SLO achievement
Interviews or focus groups that contain self-reports of SLO achievement
Student reflective writing assignment (essay, journal entry, self-assessment) on their SLO achievement.
Student surveys that contain self-reports of SLO achievement
Other 1: Consultations with course instructor
Other 2: Consultations with faculty advisor, advisors/readers of scholarly paper, and chair and members of thesis committee

Program evidence related to learning and assessment
(more applicable when the program focused on the use of results or assessment procedure/tools in this reporting period instead of data collection)


Assessment-related documents such as assessment plan, SLOs, curriculum map, etc.
Program or course materials (syllabi, assignments, requirements, etc.)
Other 1:
Other 2:

9) State the number of students (or persons) who submitted evidence that was evaluated. If applicable, please include the sampling technique used.

All instructors are involved in this process. Efforts are made to maintain a high response rate among the students involved. In particular, course-level evaluation exercises draw on materials normally collected by instructors in the course of their pedagogical activities, thus ensuring a high rate of return. Key course evaluation activities, especially end-of-course evaluations, are conducted in person rather than remotely, and generally high rates of return are obtained.

10) Who interpreted or analyzed the evidence that was collected? (Check all that apply.)

Course instructor(s)
Faculty committee
Ad hoc faculty group
Department chairperson
Persons or organization outside the university
Faculty advisor
Advisors (in student support services)
Students (graduate or undergraduate)
Dean/Director
Other:

11) How did they evaluate, analyze, or interpret the evidence? (Check all that apply.)

Used a rubric or scoring guide
Scored exams/tests/quizzes
Used professional judgment (no rubric or scoring guide used)
Compiled survey results
Used qualitative methods on interview, focus group, open-ended response data
External organization/person analyzed data (e.g., external organization administered and scored the nursing licensing exam)
Other:

12) Summarize the results of the assessment activities checked in question 6. For example, report the percent of students who achieved each SLO.

The results of our assessment efforts in the MA program continue to be generally positive, indicating that the program is functioning well. Where we find areas that need improvement, we adjust the MA program curriculum accordingly. A key point here is that because of the existence and substantial use of “variable alpha” courses at the 600 level, we can easily put new content material into place at a semester’s notice. Similarly, because of the substantial use of seminar-level courses, variability and flexibility in topic is again possible. Besides the formal connections in the assessment-evaluation cycle, informal lines of communication are open between assessment specialists and instructors.

13) What best describes how the program used the results? (Check all that apply.)

Assessment procedure changes (SLOs, curriculum map, rubrics, evidence collected, sampling, communications with faculty, etc.)
Course changes (course content, pedagogy, courses offered, new course, pre-requisites, requirements)
Personnel or resource allocation changes
Program policy changes (e.g., admissions requirements, student probation policies, common course evaluation form)
Students' out-of-course experience changes (advising, co-curricular experiences, program website, program handbook, brown-bag lunches, workshops)
Celebration of student success!
Results indicated no action needed because students met expectations
Use is pending (typical reasons: insufficient number of students in population, evidence not evaluated or interpreted yet, faculty discussions continue)
Other:

14) Please briefly describe how the program used the results.

In the aggregate, student success in the program does not indicate the need for changes in the program. However, in the faculty's judgment, periodic curricular overhaul and ongoing smaller adjustments are nonetheless desirable in order to maintain the program's high quality and keep it an attractive choice for applicants.

Within the SLS Department, there is a considerable range of assessment-based information about MA student learning. We provide here again a few notes on its use:

1. Individual Students: Students naturally, and rightly, interpret class-based and project-based assessment (including feedback and grades) as an indication of the extent to which their work is approximating the academic and professional norms of the SLS discipline and the high expectations of the SLS department. Via assessment, students come to realize their accomplishments as well as gaps in their ongoing development, and they are enabled to focus their energies on closely articulated learning targets that make sense for the individualized paths that they take through their degree program.

2. Individual Faculty: Individual faculty members of SLS interpret class-based assessment data (from conventional exams, term papers, and project-based work, etc.) as an indication of the extent to which their courses are effectively fostering student learning towards specific targeted outcomes. Additionally, faculty interpret end-of-semester course evaluation data as an important indicator of the aspects of course design and delivery which are functioning as intended and those which may be in need of adjustment. Interpretations of assessment data here are about course and instructor contribution to learning outcomes, rather than merely about student achievement of learning outcomes.

3. Administrators in SLS: The Graduate Chair interprets course grades, scholarly paper/thesis, and other requirement completion data, in adjudicating final graduation decisions about individual students.

4. Scholarly Paper and Thesis Committees: These small committees of faculty members (usually of size 2 and 3 respectively, though occasional MA thesis committees may have an additional member) utilize the major scholarly work requirement as a means for: (a) promoting a professional-grade research, writing, feedback, and final product cycle; and (b) ensuring that SLS students graduate with professional capabilities sufficient to their individualized needs and reflective of the high standards of the department. The committees work to determine the extent to which students’ research and writing reflect and approximate professional disciplinary standards for publishable and worthwhile scholarly research on topics of importance to second language studies.

5. Graduate Faculty: The graduate faculty of SLS, working in informal groups most likely centered on an individual faculty advisor, and in specific cases led by the Graduate Chair, identify individual students who may not be meeting expectations, with the intent of providing feedback to those students. Any apparent patterns of progress or lack of progress towards key learning outcomes which appear to have broad rather than individual sources will naturally lead to larger-scale program and course changes through regular curriculum review processes.

6. The Departmental Assessment Committee: The two-member assessment committee reviews and interprets all forms of assessment activities for three basic purposes: (a) to make recommendations on revisions/additions to existing assessment practices, where needed for acquiring more valid and/or useful data; (b) to make recommendations regarding areas in need of attention in program/curriculum/course design to the Graduate Faculty (who have responsibility for the MA curriculum); and (c) to construct annual reports about assessment activities.

15) Beyond the results, were there additional conclusions or discoveries? This can include insights about assessment procedures, teaching and learning, and great achievements regarding program assessment in this reporting period.

The Department is fortunate to have among its full professors two specialists in language program assessment who also have general skills in educational program evaluation. Thus, for more than twenty years, we have included assessment and evaluation procedures and processes in our regular departmental activities with a high degree of confidence in their utility. The conclusions and discoveries have been numerous and useful, and have led to periodic overhaul of the MA curriculum, including most recently, in 2005, following the collection of survey data from students, a review of scholarly work and graduation data by faculty, and consideration of the evolving nature of the profession and SLS student demographics.

The highly flexible 600- and 700-level course structure is consistently responsive to ongoing assessments of the success of the MA curriculum as it stands, and even a core introductory course such as SLS 600 has been offered in quite distinctively different manifestations in recent years. Most notably, we have assessed the MA curriculum as in need of alternative delivery modes, and have moved to offer the introductory course as well as another core course (SLS 650, Second Language Acquisition) in an online-only option (or section) as well as in a conventional face-to-face option. Beginning Fall 2015, we will be shifting student feedback processes (reports and their archiving, and possibly analysis) into a more secure, anonymity-preserving modality.

16) If the program did not engage in assessment activities, please explain.