Developing useful evaluation practices in college foreign language programs, 2007 NFLRC Summer Institute

* Following are short statements and descriptions of projects, all of which were produced by the participants in the NFLRC Summer Institute. Note that participants will be adding to this information over the next year (2007–2008) as their projects progress. If you would like to print this page, please download the original booklet (PDF).

Introduction to the Summer Institute
Joint statement on the value of evaluation
Strategies for culture change


1. Developing and improving study abroad programs
- Russian: University of Washington
- Multiple languages: Linfield College


2. Effectiveness of innovation
- Korean: Yale University
- Spanish: University of Missouri-St. Louis


3. Understanding program value
- German: Georgetown University, Hunter College CUNY


4. Institutional program review
- Multiple languages: Mount Saint Mary's College


5. Learner needs and curriculum improvement
- Arabic summer program: Georgetown University
- Spanish: U of South Carolina Beaufort


6. Program alignment with external benchmarks
- French: University of Oregon
- Multiple languages: Central Michigan University


7. Program development and advocacy
- Spanish: Cal. State University Monterey Bay
- Multiple languages: Cal. State University Los Angeles


8. Assessing student learning outcomes
- Spanish: University of Iowa
- Multiple languages: Duke University, University of Evansville,
University of Florida, University of Maryland Baltimore


9. Teacher and GTA development
- Arabic: University of Arizona
- Italian: Johns Hopkins University




From May 28 through June 6, 2007, 24 faculty members from a diverse range of college foreign language departments across the U.S. participated in the 2007 Summer Institute of the National Foreign Language Resource Center at the University of Hawai‘i. The focus of this institute was Developing useful evaluation practices in college foreign language programs. Sponsored by the NFLRC in collaboration with John Norris’s federally funded Foreign Language Program Evaluation Project, the institute aimed to help college foreign language administrators and faculty engage in useful, practical, and effective program evaluations to meet a variety of purposes. A secondary goal was to elevate the discourse about assessment and evaluation in higher education and to engender nationwide improvements in the contributions made by foreign language education. In an era of zealous accountability testing, and in light of inevitable changes in the educational and societal roles played by the foreign languages, the hope was to develop and support a cadre of FL professionals to lead the way toward rational and useful evaluation practice.

Participants in the institute surpassed all expectations. Working through the Memorial Day holiday and into the weekend, they received training in program evaluation concepts, models, and methods; they translated and applied evaluation ideas to the very real challenges of contemporary college FL education contexts; and they developed extended plans for the use of evaluation in response to unique questions in their own particular language programs. Above and beyond these planned institute activities, participants also engaged in extended discussions about the value of language education in the U.S., they articulated fundamental tensions that define much of the FL teaching and learning landscape, and they forged potential solutions based on sound evaluative practices. They even adopted a name for themselves: the Faculty Working Group on Foreign Language Program Evaluation.


In this booklet, the Faculty Working Group on Foreign Language Program Evaluation summarizes three immediate outcomes of its work during the institute. First, the group provides a statement on the Value of Evaluative Thinking and Action in Foreign Language Programs. Second, it offers broad Strategies for Culture Change in Program Evaluation. Third, it presents one-page overviews of the members' program-specific evaluation work, as one way of demonstrating the range of possible contributions to be made by evaluation in college FL education. It is their hope that these collected ideas will serve as a springboard for dialogue and action within the FL disciplines.

On behalf of all of the summer institute’s organizers and facilitators, we congratulate the participants on their hard work and excellent achievements, and we look forward to sustained collaborations with them in the service of useful foreign language program evaluation.


John Norris, Director
Yukiko Watanabe, Assistant Director
NFLRC Summer Institute 2007

The Value of Evaluative Thinking and Action in Foreign Language Programs



A joint statement by the Faculty Working Group on Foreign Language Program Evaluation


This statement is the result of discussions among the participants in the NFLRC Summer Institute 2007. The intent is to articulate the value of evaluative thinking and action to foreign language education.

Evaluative thinking and action provides a framework for discussion in programs or departments about fundamental questions of program effectiveness. These discussions can have a democratizing and unifying effect—democratizing because all voices are heard, and unifying because the process leads to communication and consensus building. Collaborative discussion and action that involves all stakeholders results in a heightened commitment of all participants to the vitality of the program, thus contributing to a sense of academic community.

The evaluation process allows faculty members to understand the program as a whole and to articulate to themselves and others what they want students to achieve in the areas of knowledge, skills, and dispositions. By identifying strengths and weaknesses, they formulate a plan (or plans) of action to increase program effectiveness and maximize student learning. The goal is to make the learning process more efficient and to create a well-articulated curriculum that is responsive to changing circumstances, all within a cyclical process of innovation and evaluation.

Evaluative thinking and action has further benefits. It enables departments to address in action-oriented ways common problems at the program level, such as low enrollments in some languages, attrition at various levels, and difficulties in the curricular transition from lower-division to upper-division courses. It offers opportunities for individual faculty members to engage in professional development activities, such as scholarship in teaching and learning and improving teaching practices through ongoing reflection. It can increase communication across departments, leading to cross-pollination between disciplines and opportunities for collaboration with colleagues on evaluation projects, as well as professional activities in other areas.

Beyond the department level, evaluative thinking and action enables faculty members to enhance the profile of their program or department within the institution by establishing themselves as leaders in evaluation initiatives and showcasing the accomplishments of their evaluation-related projects. Such leadership activities position the program or department well in requests for support (e.g., funding, faculty lines). Finally, the ability to demonstrate cycles of innovation and evaluation empowers foreign language professionals, enabling them to make a strong case for the unique contributions of language studies in a pluralist and globalized world.

Strategies for Culture Change in Program Evaluation



A joint statement by the Faculty Working Group on Foreign Language Program Evaluation


This statement is the result of discussions among participants in the NFLRC Summer Institute 2007. The intent is to encourage the foreign language field to recognize program evaluation as indispensable for enhancing student learning and program quality, and to enable the field to articulate and demonstrate—internally and externally—the unique contributions of language studies in a pluralist and globalized world.


Strategies for changing perceptions of evaluation and enhancing its value

• Focus on program improvement as a goal of program evaluation.
• Emphasize the usefulness of evaluation for: (1) student learning, (2) program articulation, (3) departmental collaboration, and (4) academic community.
• Highlight the public, participatory, and inclusive nature of the evaluation process.
• Link evaluation goals to stated institutional priorities.


Strategies for encouraging faculty-led evaluation

• Build on program information (curriculum, syllabi, final exams, papers, etc.) and systematize evaluation work already conducted in the department.
• Lead institutional evaluation efforts by example; forge alliances across the institution; draw on available institutional resources.
• Appropriately recognize and incentivize evaluation work within the department and the institution.
• Integrate evaluation into standard administrative, curricular, and teaching practices.
• Pursue professional development opportunities and external funding.
• Generate and showcase successful examples of evaluation.


Strategies for professional organizations to enhance useful evaluation

• Recognize and disseminate successful models of program evaluation.
• Develop policy statements on useful program evaluation.
• Organize professional development events focusing on program evaluation.
• Facilitate the establishment of professional networks supporting program evaluation efforts.


N.B.: For examples of all of the above (and related resources), please monitor the Foreign Language Program Evaluation Project (FLPEP) web site:

Curricular Reform, Student Outcomes, and Identity of Majors:
A Case Study of German Program Outcomes Assessment


Heidi Byrnes
Peter C. Pfeiffer
Georgetown University, Department of German




I. Program description and overall purpose for evaluation

The Department’s curricular reform project “Developing Multiple Literacies” led to a unified set of educational goals for majors and non-majors alike. As a result, the question of the major’s identity, distinctiveness, and value arose. In order to evaluate this issue synchronically and diachronically, both current students and alumni are included in data gathering.


II. Intended uses, users, and evaluation questions

Intended uses
Primary uses of this evaluation are (a) to understand the distinctiveness and value of the German major from the perspectives of multiple program stakeholders; (b) to identify any needed changes, additions, and other adjustments in the targeted outcomes of the German major; (c) to provide a basis for curricular revision in the German studies degree program, if needed.

Its secondary uses are (a) to increase awareness and engagement among GUGD stakeholders about the nature and value of the German studies degree and the undergraduate curriculum; (b) to involve graduate students in work on the GUGD program in order to facilitate their understanding of and investment in the curriculum; (c) to provide an opportunity for teacher development through evaluation results; and (d) to sustain evaluation practices and awareness of GUGD efforts within the Georgetown institutional context and the broader national FL education context.


Intended users
Intended users are faculty, students, graduate students, alumni, and the administration.


Evaluation questions we posed are: (a) What are the characteristics of the major? (b) To what extent are they distinctive? (c) What values are attached to the major?


III. Methods, tools, and procedures

Three instruments were created for three distinct respondent groups: (a) web-based questionnaire with Likert-scale responses and open-ended responses (all students in German courses, Spring 2007); (b) focus groups (current majors at all levels, Spring 2007); (c) web-based questionnaire (methodology to be determined; alumni, AY 2007-08).


IV. Intended value added

Creation of an explicit basis for evaluation of the program and, as appropriate, for curricular, pedagogical, and administrative actions. Increased awareness internally; substantive representation of the program to outside stakeholders (administration; alumni; prospective students); modeling of FL program outcomes assessment to neighboring departments and the FL profession at large.

Note: Prof. Hiram Maxim was one of the lead PIs in the initial phase of this project until summer 2007.


Summative Evaluation of 4th Year Hybrid Korean Course

Seungja Kim Choi
Yale University, Center for Language Study




I. Program description and overall purpose for evaluation

The Korean language program is situated in the Department of East Asian Languages and Literatures, along with the Japanese and Chinese programs, at Yale University. In response to the unique situational characteristics of the Korean program, such as small enrollment, the diverse backgrounds of heritage students, limited faculty resources (2 full-time), and the need for more advanced language courses due to the new foreign language requirement at Yale, web-based materials were constructed for a 4th-year online/in-class hybrid course. The course, K154, will be offered this fall for the first time. The temporary course approval calls for an evaluation of this course.


II. Intended uses, users, and evaluation questions
The intended use of this evaluation is permanent course approval, and the primary intended users are the Language Study Committee, the New Course Committee, and the department. The New Course Committee wants to know whether this hybrid course is equal to a traditional course in terms of workload and effectiveness (student learning outcomes), and whether it differs from the 3rd-year course, K150. This evaluation will focus on assessing the delivery and effectiveness of the course in terms of student learning outcomes and workload by gathering information from the students in the course.


III. Methods, tools, and procedures

In order to measure the effectiveness of the course, information on student learning outcomes will be gathered through (a) pre- and post-course essays and (b) pre- and post-course reading comprehension tests developed by the faculty members. Information on workload and effectiveness will be gathered via a student questionnaire and a focus group interview.


IV. Intended value added

(a) We can identify students’ learning strategies for using online materials, the advantages and disadvantages of a hybrid course, and pedagogical strategies for teaching through a hybrid format.
(b) If the results of this evaluation show that a hybrid language course at the advanced level is comparable to or more effective than a traditional class in terms of delivery and student learning outcomes, more advanced literacy courses in other languages could be developed.
(c) Through detailed planning of student learning outcomes assessment, the evaluator’s abilities in course design, articulation, and teacher training will grow.
(d) This evaluation may promote a program evaluation culture that is systematic and sustained, contributing to curriculum and faculty development at Yale.

Setting Benchmarks: First Year French Program

Hilary J. Fisher
Coordinator, First-year French
University of Oregon, Department of Romance Languages



I. Program description and overall purpose for evaluation

The undergraduate French program for first and second year is designed and implemented by two coordinators, but without the benefit of a departmental mission or set of goals for student achievement. The overall purpose is to begin the process of establishing standards for student performance, through determining what students are learning and what they are able to do with the language at the end of the first year of instruction.


II. Intended uses, users, and evaluation questions
Intended Uses
(a) Determine whether the current first-year French curriculum brings students to the proficiency level recommended by the Oregon benchmarks.
(b) Plan revisions to the program based on the findings.

Expected users
First-year French teaching faculty, Graduate Teaching Fellows (GTFs) assigned to first-year program, second-year French program director, and Department Chair

Evaluation questions
(a) What parts of the current first-year curriculum teach to the topics and functions recommended by the Oregon benchmarks?
(b) What parts are missing from the current curriculum?
(c) How are the gaps going to be addressed?
(d) What is redundant in the current curriculum? Are those redundancies valuable?
(e) To what extent do the Oregon benchmarks reflect our ideas of what a first-year curriculum should be?
(f) Following revisions to the program, how will we know students are achieving the recommended benchmarks?


III. Methods, tools, and procedures
(a) Document analysis: First-year textbook, syllabus, Oregon benchmarks
(b) Map the current curriculum against the Oregon benchmarks
(c) E-mail questionnaire to first-year French language faculty and GTFs on expected skills and knowledge for first-year learners
(d) Conduct student focus group on current program
(e) Hold meeting of first-year teaching faculty to discuss findings and plan revisions


IV. Intended value added
(a) Improved articulation between the first and second year French program
(b) Highlighted status and importance of a first-year language program
(c) Improved student placement, making instruction toward proficiency more efficient
(d) A mission statement for the 2-year French language program, with goals and objectives designed to fulfill it

Using Program Evaluation to Make a Case for a New Spanish BA

Rafael Gómez
María Zielina
Frauke Loewensen
California State University, Monterey Bay,
School of World Languages and Cultures, Spanish Program




I. Program description and overall purpose for evaluation
Currently, the Spanish program is housed in the School of World Languages and Cultures at California State University Monterey Bay and offers a bachelor’s degree in World Languages and Cultures. The purpose of the evaluation project is to add value to the existing program by collecting evidence in support of the creation of a bachelor’s degree in Spanish.


II. Intended uses, users, and evaluation questions

The evaluation will be used to gain approval of the “Permission to Plan” document, a required first step in the creation of a new program at our university. The intended users include department faculty, the Dean, the Provost, the Academic Senate, students and alumni, as well as the community. The evaluation questions to be addressed include: What value does the Spanish BA program add to the university? What evidence exists that there is a need for a BA in Spanish? In what ways will the Spanish BA respond to the vision of the university?


III. Methods, tools, and procedures

The evaluation consists of three phases: data collection, data interpretation, and presentation of findings. Raw data will come from document review/analysis and surveys. The latter include phone interviews, web-based questionnaires, and focus groups, and will request input from other CSU FL chairs as well as stakeholders (department faculty, students and alumni, community). The survey questions will be pilot-tested. Document review includes statistics (such as enrollment data) as well as documents published by our campus and other comparable institutions about program description, learning outcomes, curriculum, and the university’s vision/mission. Data interpretation will be conducted in the context of a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis, i.e., examining the strengths and weaknesses of the suggested degree program, as well as the opportunities and threats it presents to the existing structure. The last phase, the presentation of findings, is the completion of the “Permission to Plan” document.


IV. Intended value added

This project will provide decision makers with pertinent data that will help them make an informed decision about the suggested program change. It will foster a culture of evaluation at our institution, facilitate the exchange of ideas between different stakeholders, and respond to the vision of our institution.


Student Learning Outcomes Assessment

Antonio Grau Sempere
University of Evansville, Department of Foreign Languages




I. Program description and overall purpose for evaluation
The Department of Foreign Languages at the University of Evansville houses several programs: some lead to a major (Spanish, French, German), while others offer only minors (Japanese, Russian). Together they form the larger Foreign Languages program, which has 10 full-time faculty members and 2 part-time adjuncts. Students are required to complete certain coursework requirements (including language, literature, culture, civilization, and linguistics) and at least one semester (or eight credits of coursework) of study abroad. In their last semester before graduation, students are also asked to complete an exit exam that tests their reading and listening comprehension as well as their oral and written expression. Other components of our program include a mission statement and objectives focused on preparing students for the personal and professional challenges of a multicultural society and a global marketplace, and a list of learning outcomes that our students are expected to achieve by graduation. In a nutshell, our students are expected to understand, analyze, and speak the language, and to know the culture, literature, and history of which the language is a part. The purpose of the evaluation is to improve the quality of our program.


II. Intended uses, users, and evaluation questions
The intended users are faculty, students, and the Dean. The intended use is to improve our program by redesigning our curriculum, changing our teaching methodology, and so on. The Dean may base decisions about our program (monetary, related to new faculty and new classes, etc.) on the information produced by the evaluation. Students may take any necessary steps (additional classes, more time studying abroad, etc.) if they do not reach the intended level. Prospective students may base their decision on the success of our former students. The following are the questions for evaluation:


Q1: To what degree do our students graduate with the basic knowledge we want them to have?
Q2: To what degree are we providing them with the necessary resources for them to graduate with the basic knowledge we want them to have?
Q3: To what degree are our students able to compete with other students coming out of similar programs?


III. Methods, tools, and procedures
Q1: Structured interviews, carried out in the student’s junior year, with the student’s advisor and at least one other faculty member of the UEFL program present. The interviews are designed to elicit the student’s attitudes toward and self-evaluation of his/her own learning experience. In addition, a portfolio is assessed in the senior year, possibly during the capstone course. The portfolio will include a variety of materials: (a) a written self-reflection piece, which the student presents to his/her advisor and at least two other faculty members of the UEFL program, addressing the student’s learning experience at UE and commenting on the papers included in the portfolio; (b) at least three term papers dealing with literature, linguistics, and culture/history/current events; and (c) any additional piece of writing (term paper, self-reflection piece, essay, creative writing, publication, etc.) that the student thinks is important to include.


Q2: Analysis of assessment results. Patterns in students’ results will indicate which outcomes are receiving enough attention in our curriculum and which need more support through curricular rearrangement.


Q3: Unstructured interviews with alumni and with chairs of departments at other institutions of our size.


IV. Intended value added
The value of this project lies in the definition I gave the assessment project itself: an ongoing process of reflecting on what we do in our program, based on systematic and objective evidence; promoting communication, information, and education both internally (among faculty and students) and externally (with administrators); and judging the effectiveness of our program to better guide its development and improvement.


Program Evaluation with Respect to State-Required Outcome Performance for Certification

Susan Knight
Central Michigan University, Department of Foreign Languages, Literatures and Cultures




I. Program description and overall purpose for evaluation
The Department of Foreign Languages, Literatures, and Cultures of Central Michigan University offers a B.S. in Education for majors and minors in French, German, and Spanish. All FL majors and minors on teaching degrees are presently required to pass a Simulated Oral Proficiency Interview (SOPI) at the Intermediate-High level of the ACTFL scale before entering the FL methods course. This is an institutional test, created and assessed by the 24 members of the FL department, some of whom received formal, though minimal, training several years ago. It has become increasingly important to evaluate the effectiveness of our program’s speaking outcomes, especially since the State of Michigan has changed its FL teacher certification policy and will next year require performance at the Advanced-Low (AL) level of the ACTFL OPI, a level also required for NCATE certification.


The purposes of the evaluation project are (a) to determine how well the FL Department is preparing its majors and minors in education to pass the new state certification criteria, and (b) to determine what changes are needed to help students achieve that level.


II. Intended uses, users, and evaluation questions
Intended users are the majors and minors in the program and the faculty of the FL department. To a lesser degree, they include the Department of Education as well as high school programs and the future students these graduates will be teaching. Intended uses center on improving (a) the certification percentage of graduating seniors, (b) the general FL program, (c) student satisfaction with the program, and (d) student recruitment into the FL program. The following evaluation questions are posed:


Q1: Can graduating FL education majors and minors pass the OPI at the Advanced-Low level?
Q2: Where in the program are students getting the opportunity to develop the competence needed to pass this test?
Q3: What program changes are needed to ensure this level of proficiency upon graduation?

III. Methods, tools, and procedures:
Question 1: Can students pass the OPI at the Advanced-Low level?
(a) Analysis and examination of existing documents and reports. How many students in the past have not passed parts of the Department’s SOPI? How many tries? What parts of the test have they passed and not passed? What were examiner comments (error analysis)?
(b) ACTFL OPI. Graduating seniors will take the OPI to determine their level. Results will be compared to their scores on the institution’s SOPI.


Question 2: Where in the program are opportunities to develop the competence needed to pass this test?
(a) Focus groups (seniors). Sample questions: What classes helped them? What were the tasks in these classes? Did they attend conversation hours, etc.? What do they do outside of class to increase proficiency? Have they studied abroad? How many hours of contact do they have with the TL each day? How did they prepare for the SOPI? Suggestions for improvement?
(b) Questionnaire followed by a meeting (faculty) to determine how proficiency is addressed in various classes (syllabi, materials, in- and out-of-class tasks, methods of evaluation)
(c) SOPI at 200, 300 and 400 level for sample student groups.
(d) Samples of student oral production in various required courses at different levels evaluated by faculty.


Question 3: What changes are needed to ensure this level of proficiency?
(a) Use of the previously discussed senior focus group and faculty discussion group
(b) Survey (of alumni now teaching). Possible questions: What helped prepare you to teach in the TL and, specifically, to speak proficiently? What would you suggest to improve the program?
(c) Analysis and examination of existing documents and reports. Examine syllabi to find out how course content and requirements relate to proficiency standards.


IV. Intended value added
This will not only increase articulation between members of the faculty regarding overall student learning outcomes and how to achieve them, but also lead to a better articulated program as a whole. It will also allow possible opportunities for faculty presentations and research publications.

Student Satisfaction Survey to Measure Students’ Attitudes towards Their Learning at Different Stages of Their German Studies

Annette Kym
Hunter College, CUNY, German Department




I. Program description and overall purpose for evaluation

Hunter College, with an enrollment of approximately 20,000 students, is part of the City University of New York. The German Department is a freestanding department with 5 full-time faculty members, 1 TA, and a number of adjuncts. The average enrollment per semester is between 360 and 400 students; approximately 28–30% of those students are in upper-level courses taught in the target language. Hunter College has a 4-semester language requirement. The German major requires 24 credits, the minor 12 credits.

We want to evaluate student satisfaction at various stages of our program. In this process, we want to find out what the value of learning outcomes is to students finishing the language requirement, and to students finishing upper-level courses. In addition, we want to find out how effective the department’s teaching approach is as judged by the students.


II. Intended uses, users, and evaluation questions

Intended uses on the departmental level
By engaging in this evaluation, we want to find evidence on students’ perceptions of the value of their language studies. The information will help the department to take actions to improve the program. It will also be used in discussions with the administration about resources allocated to the department.


Intended uses on the college level
We want to use the data in the broad discussion about the General Education Requirement, which is currently under review. The systematically collected data will produce evidence of the value of the FL requirement as perceived by the students. Furthermore, the college is undergoing accreditation by the Middle States Association and this type of information can be used for that purpose.


III. Methods, tools, and procedures

We plan to collect the data through focus groups with students and student surveys. We will run a pilot project in the fall semester 2007, and conduct the focus groups and administer the finalized survey in spring 2008. The data will be analyzed over summer 2008 (Middle States Review will take place in Fall 2008).


IV. Intended value added

The German Department has engaged in outcomes assessment for a number of years, and we have data about student learning outcomes. However, we have not systematically collected data from the students’ point of view or measured students’ perceptions of their learning outcomes. Getting data on their perceptions will fill this gap and, in conjunction with other indicators, will help us improve the program and assist us in long-term program planning.

Evaluation of a Spanish Pilot Project in the Center for Languages and Cultures

Beth Landers
Director of the Center for Languages and Cultures,
University of Missouri-St. Louis




I. Program description and overall purpose for evaluation
The Center for Languages and Cultures is located within the Department of Anthropology and Languages at the University of Missouri-St. Louis. UMSL is an urban, public research university with high numbers of non-traditional students who commute to campus. The Center is responsible for elementary and intermediate-level language courses in nine languages (Spanish, French, German, Japanese, Chinese, Modern Greek, Arabic, Latin, and Ancient Greek) as well as ESL. Approximately 40% of all UMSL undergraduates have a three-semester foreign language requirement (all students who earn BAs, as well as students in a few other select degrees). The Center serves these students, and also endeavors to retain students beyond the three-semester required sequence, preparing them to major or minor in a language.


The specific component of the program to be evaluated is a Spanish Pilot Project initiated and funded by the College of Arts and Sciences as part of an initiative to improve all required classes in Math, Foreign Languages, and English. This particular project focused on ways to improve the oral and written proficiency of students enrolled in the first three semesters of language courses (the required sequence). At its inception, no provision was made to evaluate the pilot project, but it was intended to inform curricular changes in all lower-division classes across language sections.


II. Intended uses, users, and evaluation questions
This evaluation will identify best practices from the project that can be generalized and applied to all sections of elementary French, German and Spanish. The users will thus be the Director of the Center and the thirteen instructors in these sections. Specific questions to be asked are: What did the project do that was different from the past? What was innovative? What was traditional? What worked well and why? What did the teachers like? What did the students like? How did the Pilot Project affect the oral and written proficiency levels of students? How did the Pilot Project affect student success rates? Did the Pilot Project affect student motivation?


III. Methods, tools, and procedures
In order to answer these questions, we will use a variety of methods: 1) a questionnaire distributed to instructors; 2) a focus group of the three instructors at the end of the project; 3) analysis of videotapes of particularly innovative class sessions; 4) a student questionnaire distributed to 3rd-semester Spanish students in fall 2007; 5) comparison of OPI ratings of the control group and the pilot project group in their third semester; 6) comparison of compositions on final exams with compositions from previous years; 7) analysis of course materials developed over three semesters; 8) analysis of student course evaluations; 9) analysis of student attendance/absence patterns; 10) a report from the Office of Institutional Research on the pass/fail rate of students over the three semesters of the pilot project; and 11) a report from the OIR on how many students from this group enrolled in Spanish 4 in spring 2008.


IV. Intended value added
This evaluation will allow us to target the best practices developed in the Spanish Pilot Project and incorporate them in all basic language classes in order to refresh teaching practices, bring a measure of standardization to basic language instruction, and improve student learning outcomes. We hope that doing so will increase the number of students who continue their language study beyond the third semester. Ultimately, we hope to encourage further investment in the Center for Languages and Cultures by assuring the College that its initial investment in the Spanish Pilot Project has been used in constructive ways.

Evaluating the Foreign Language Requirement at Duke University

Carolyn Lee
Associate Professor of the Practice of Chinese
Ingeborg Walther
Associate Professor of the Practice of German and incoming Assistant Dean of Trinity College
Duke University, Foreign Languages Departments (Asian and African Languages and Literature, Germanic Languages and Literature, Romance Studies, Slavic and Eurasian Studies)




I. Program description and overall purpose for evaluation
In the year 2000, Duke University instituted a new curriculum for undergraduates of Trinity College (the undergraduate college of the School of Arts and Sciences), which includes a foreign language requirement. While the last several years have been devoted to addressing issues of implementation, the Dean of the College is now in the process of evaluating the curriculum. Most recently, a committee consisting of several language program directors and the director of the assessment office has been asked to determine a means of documenting student learning outcomes, for the purpose of understanding the extent to which students are achieving the goals articulated by the foreign language requirement.


II. Intended uses, users, and evaluation questions
The primary uses of the evaluation project are to demonstrate the merit and value of the foreign language requirement by documenting student learning outcomes, to chart student progress at various stages of their foreign language learning, and to improve foreign language programs and curricula. The information obtained from the evaluation project will primarily be used by university administrators to document what students are learning as a result of the foreign language requirement, and possibly to more clearly identify and understand how individual language programs prepare students to meet the foreign language requirement goals established by the university. The information gathered will be used by individual language program faculty to assess the extent to which their own programs and curricula are preparing students to meet these goals, for the purposes of engendering programmatic and curricular improvement. In particular, we would like to investigate the following questions: 1) What levels of language proficiency are students achieving in speaking, reading, and writing? 2) What cultural knowledge, understandings, and perspectives are students gaining at each level of the curriculum? 3) To what degree do factors such as language program curricula, study abroad, heritage, and previous language learning experiences play a role in students’ learning outcomes? 4) What are student perceptions of the university foreign language requirement?


III. Methods, tools, and procedures
We will be using multiple methods to collect the information we need, including the Simulated or Computerized Oral Proficiency Interview (SOPI or COPI), course-embedded assessments, student portfolios, online questionnaires, individual interviews, and focus groups. We will most likely collect this information in stages: using the fall semester of 2007 for planning and piloting, the spring semester of 2008 for initial data collection, and the following academic year for analysis and additional data collection, as informed by decisions based on collected data.


IV. Intended value added
We imagine that the very process of engaging in the evaluation project will help individual language programs address issues of curricular goals and coherence, faculty development, and articulation with other programs and departments across the university. We expect that the project will help communicate more precisely to the outside world (administrators, faculty, students, parents, communities) just what it is our foreign language programs do and what students learn. This may enhance the profile of foreign language programs within the university and beyond as an integral part of the humanities in a liberal arts education. And finally, we hope that this project will contribute to the growing research in the field of foreign language education and program evaluation.

Assessment of Student Learning Outcomes for the Spanish Major

Judith E. Liskin-Gasparro

University of Iowa, Department of Spanish and Portuguese




I. Program description and overall purpose for evaluation
The undergraduate major in Spanish consists of 12 upper-division courses: 1 course in each of 4 core areas (Peninsular literature, Spanish American literature, Culture, and Linguistics) and 8 elective courses. The purpose of evaluation is to provide evidence that will inform faculty discussion about the effectiveness of the curriculum and, thus, serve as the basis for making changes to enhance and improve student learning. The immediate impetus for the evaluation is the university-wide mandate to formulate an outcomes assessment plan for the major for the upcoming reaccreditation cycle.


II. Intended uses, users, and evaluation questions
Intended uses
(a) Make changes in the curriculum (types of courses, content of existing courses)
(b) Increase number of/access to curricular components most highly valued by students
(c) Review (and possibly adjust) the departmental hiring plan in light of the evaluation
(d) Identify additional areas that would benefit from evaluative attention
(e) Provide evidence of program quality and departmental commitment to improvement of student learning for internal and external proposals (e.g., faculty lines, funding)


Intended users: Faculty, students, administrators


Evaluation questions
Q1. How effective is our curriculum in teaching speaking skills?
Q2. How effective is our curriculum in teaching high-level writing skills?
Q3. Which components of the program do students value most highly as contributing to their learning?


III. Methods, tools, and procedures
(a) Speaking (Q1): Locally designed interview procedure that focuses on the Intermediate Mid to Advanced range. The outcomes assessment (OA) committee will produce an interview protocol and train a small number of faculty in the procedure. Speech samples will be designated as Meets Expectations (Intermediate High), Exceeds Expectations (Advanced Low or higher), or Does Not Meet Expectations (Intermediate Mid or lower).

(b) Content-based writing (Q2): Work samples (e.g., research papers) embedded in senior-level courses will be assessed for the ability of the writers to produce in Spanish (i) coherent analyses and interpretations of Spanish/Latin American literary texts or cultural phenomena, (ii) coherent summaries and evaluations of linguistic phenomena and construction of linguistic arguments, and (iii) coherent analyses of Spanish/Latin American films. Students will create electronic portfolios that include 2 papers that they believe represent their best work from 1 or more of their 3 highest-level courses. A sample of these papers will be assessed by the OA committee (Meets Expectations, Exceeds Expectations, and Does Not Meet Expectations) according to a rubric designed by the committee for this purpose.

(c) Student perceptions of value of program (Q3): Students will write for their portfolios a reflection essay, in which they will be asked to reflect on the aspects of the program that contributed most significantly to growth in their (i) intellectual/academic knowledge and analytical ability, (ii) linguistic skills, and (iii) understanding of diverse cultural perspectives. Focus groups will be held with a sample of graduating majors to address the same topics. Reflection essays and focus group transcripts will be analyzed for key themes related to points (i), (ii), and (iii).


IV. Intended value added
(a) Provide an evidentiary basis for understanding the program and how it functions (evaluation), and produce/implement a plan for change (actions) to increase its effectiveness in enhancing student learning.
(b) Communicate the value of our program to administrators, students, external agencies, etc.
(c) Provide a site for research on teaching and learning in foreign language programs.

Foreign Language Enrollment Growth Initiatives at Cal State L.A.

Sachiko Matsunaga

Chair of Department of Modern Languages and Literatures,
California State University, Los Angeles




I. Program description and overall purpose for evaluation
Our Foreign Language Program, which belongs to the College of Arts and Letters, consists of MA programs in French and Spanish, BA programs in Chinese, French, Japanese, and Spanish, and Subject Matter Preparation programs in French, Japanese, and Spanish, as well as elementary-level courses in Arabic, Italian, Korean, and Portuguese. The main components that constitute our Foreign Language Program are courses, teaching, and students in the pedagogical arrangement, and staffing and finance (for instructors, space, and equipment) in the institutional arrangement.


II. Intended uses, users, and evaluation questions
Intended users are: primarily full-time faculty (CHIN; FREN; JAPN; SPAN), Department Chair, and Dean; secondarily part-time faculty, students, other departments, senior administrators, local schools, and communities.

Intended uses are (a) faculty act on the evaluation results (e.g., modify courses/programs; identify new practical opportunities); (b) the Department Chair seeks support from the Dean, who decides whether to advocate our FL Program to the senior administrators to fund new initiatives; (c) senior administrators decide whether to increase funding for our growth initiatives; and (d) local schools and communities decide whether to express support for potential changes to be made based on the evaluation findings and our plan.

The first question of evaluation is: What are other CSU campuses doing in serving and recruiting FL students? We ask this question in order to know why our overall enrollment has declined (our SFR is lower than the average CSU SFR). Possible reasons could be environmental, institutional, and/or programmatic. The findings will help us offer explanations to the administrators. The second question of evaluation is: In what areas can we expect enrollment growth? We ask this question in order to know what actions we need to take in order to increase FL enrollments on our campus. The findings will be used to request support from the administrators.


III. Methods, tools, and procedures
Methods that we will employ are: (a) Internet search for factual information about other CSU campuses; (b) a questionnaire distributed electronically to all CSU FL Department Chairs to find out what they are doing in serving and recruiting students; (c) a faculty meeting, with the Department Chair as facilitator, to come up with areas for growth; (d) focus group discussions to investigate whether the areas for growth identified by faculty match the needs of learners (current majors and non-majors as well as alumni), local schools, and communities. Responses on the CSU survey will be collected electronically, and follow-up phone interviews will be conducted if necessary. We will use a Strengths, Weaknesses, Opportunities, and Threats (SWOT) analysis of all the data, and present the results to the Dean and senior administrators in order to convincingly communicate to them the importance of the role that we play within the University and the necessity of funding for the identified growth (with supporting evidence gathered from all stakeholders) to happen.


IV. Intended value added
If we successfully gain support from the administrators with these initiatives, we will be able not only to strengthen our existing programs for our majors with new hires, more space, and better equipment, but also to further diversify our course offerings for non-majors. In addition, our profile within the University will be raised, possibly to the extent that the College or University might consider implementing a foreign language requirement. Furthermore, it will prepare the Department well for the program review that we need to start conducting 18 months from now.


Evaluating the Modern Languages and Linguistics Department at UMBC

Ana Oskoz

Assistant Professor of Spanish,
University of Maryland Baltimore County, Modern Languages and Linguistics department




I. Program description and overall purpose for evaluation
The Modern Languages and Linguistics (MLL) department at the University of Maryland, Baltimore County (UMBC) emphasizes communication across boundaries. The MLL program has been praised for its innovative and interdisciplinary nature, but since 1985, the program has expanded in both scope and number of students. Recently, the faculty started re-examining the curriculum for articulation purposes. Still, there is a need to frame the MLL curriculum within the university mission statement, and to find means of documenting learning outcomes that fit with the department and university goals.


II. Intended uses, users, and evaluation questions
Intended uses
(a) To examine the extent to which the MLL department’s (and each language area’s) mission statement and goals fit with the university mission statement and goals.
(b) To establish a common understanding in the MLL department (and in each language area) of “knowledge,” “skills,” and “dispositions.”
(c) To establish expected student learning outcomes in the MLL department (and in each language area).


Intended users
The intended users are faculty, current and prospective students, high schools, and administration.


Evaluation questions
(a) How do the MLL department’s objectives fit with the University mission?
(b) How do the Spanish area objectives fit with the MLL department mission?
(c) What is the common understanding of “knowledge,” “skills,” and “dispositions” that MLL students need?
(d) How do the curricula (and specific classes and syllabi) in the Spanish area provide students with the knowledge and skills needed to achieve the program objectives?
(e) The fifth question is divided into two parts:

1. What are the MLL department student outcomes regarding writing, speaking and culture?
2. How does the Spanish area determine whether Spanish area students have achieved expected program outcomes (writing, speaking and culture)?


(f) How well does the program communicate program goals to the administration, other departments at UMBC and students (current and prospective)?


III. Methods, tools, and procedures
The department will use different methods/tools to collect the information needed: (a) document analysis to understand the alignment of mission statements; (b) discussions with faculty to define relevant concepts; (c) a questionnaire to alumni to gain information regarding the validity of the program mission statement and perceived program value; (d) focus groups with students to elicit information regarding program perceptions, program components (speaking, writing, culture, etc.), and relationships with faculty and the program; (e) final papers and class discussions to observe accomplishment of culture learner outcomes; and (f) short questionnaires to the administration and other departments to gain information regarding professional relationships with them.


IV. Intended value added
The intended value added of the evaluation is to increase awareness of how the MLL department addresses issues of curricular goals and coherence, and articulation with other programs and departments across the university. The evaluation will also emphasize the MLL program identity to outside stakeholders (administration; alumni; prospective students, etc.), and it will highlight its specific contributions to the students’ educational experience within and beyond UMBC.


Tailoring Language Majors’ Year Abroad Experience to Improve Outcomes

Violeta Ramsay
Chris Keaveney

Co-Chairs of Department of Modern Languages,
Linfield College




I. Program description and overall purpose for evaluation

The Department of Modern Languages at Linfield College offers minors and majors in French, German, Japanese, and Spanish (as well as first-year courses in Chinese and Latin). The department has ten full-time faculty and 1-3 adjunct instructors per year. The language requirement for a B.A. degree is one year. The purpose of our evaluation is to improve the level of achievement of all majors.


II. Intended uses, users, and evaluation questions

The intended uses of the evaluation are (a) to make sure all language majors have the same opportunities to reach the goals of the major, and (b) to improve students’ year-abroad experience. The evaluation question we have posed is: What aspects of students’ preparation, the year-abroad experience, or the design of the capstone seminar cause wide disparities in levels of achievement among language majors?


III. Methods, tools, and procedures

These are the steps we take in the evaluation process:
(a) Design a manual for majors that contains information on all stages of the major program
(b) Select programs that best match our program and goals; establish communication with those programs, and ask students to select among them.
(c) Collect data from students to establish a basis for the observation of development: This includes samples of written work and an oral proficiency interview (OPI) before students leave.
(d) Increase contact with and supervision of students’ work while they’re abroad, creating a one-credit course to make it possible. The course calls for the following:


• Student samples of papers and two essays
• A second OPI by phone
• Initial work on senior project: choosing topic, making outline, collecting data
• A third OPI upon return to home campus


(e) Make changes in the capstone seminar to allow for more contact with advisors, as well as more time and guidance during the writing, revision, and presentation of the senior project.


The following timeline guides the procedures:
(a) The majors manual is given to students when they declare a major and reviewed before departure
(b) Pre-departure data collection takes place during students’ 4th semester
(c) Data collection from abroad takes place during 5th and 6th semester
(d) Guidance in capstone seminar takes place during students’ 7th semester


IV. Intended value added

Improvement in communication with and guidance for students during their year abroad (the most critical year for the major). The junior-year experience can be tailored to better match the rigor and quality of our program at home.

Preparation for Department Evaluation:
Building Faculty Capacity in Self-Evaluation

Montserrat Reguant

Mount Saint Mary’s College, Los Angeles, Department of Language and Culture




I. Brief description of the program and overall purpose for evaluation
The purpose is to prepare the Department of Language and Culture (LC) at Mount Saint Mary’s College to respond to the cyclic evaluation required by the institution, and at the same time to build faculty capacity in self-evaluation in order to improve the effectiveness of the whole department.


II. Intended uses, users, and evaluation questions
The intended use of the evaluation activity is to improve the department as a whole over time, as preparation for the evaluation of the department in spring 2008. The intended users are the faculty members of the Language and Culture Department and the Academic Vice President/Provost.

Evaluation question: How can the Department of LC improve itself over time, in preparation for the departmental evaluation in 2008, and build faculty capacity in self-evaluation?


III. Methods, tools, and procedures
We are going to utilize self-assessment, using indicators provided by the Curriculum Committee on department goals, curriculum, faculty, student assessment, and the institution. Surveys have already been prepared. They will be distributed to the faculty of the LC department, filled out, and collected during the first department meeting of the 2007-2008 academic year, on August 27, 2007. Data will be analyzed over the following three weeks, and action for improvement will start before the end of the fall semester. The following is a selection of indicators included in the surveys:


(a) Department goals: Do they address the needs of the students, the College Mission and strategic planning, long-term planning, student assessment, and student retention?


(b) The Curriculum:
1. How does it support the department goals?
2. How does it meet the needs of majors and minors?
3. How does it compare to that of a similar institution?


(c) Faculty: Commitment to teaching, strong collegial environment, innovation in pedagogy, quality in course content, indicators of student success, on-going effort to strengthen the department, assessment of the department’s curriculum, integration of full-time and adjunct faculty, assessment of the quality of the major

(d) Student Assessment:
1. Alignment of department objectives and assessment methods
2. Methods of assessment across sections
3. Embedded techniques of assessment across sections and/or courses
4. Use of student feedback on a regular basis to assess the efficacy of student assessment in the department
5. Use of a variety of methods as part of student assessment
6. Use of at least one type of student-driven assessment method (e.g., electronic portfolio, senior capstone project)


(e) The Institution: Library offerings and services, tech resources and support, student support services, faculty support services, professional development and other enriching opportunities, equipment and facilities, and balance of full-time and adjunct faculty.


IV. Intended value added
This project will expand the culture of evaluation by preparing for departmental self-evaluation as an improvement tool. This self-reflection for departmental improvement will ensure that we are well prepared for the external evaluation, and at the same time we hope to build faculty capacity in self-evaluation to enhance student learning.


Development and Evaluation of Arabic Teacher Development Program

Martha Schulte-Nafeh

University of Arizona,
Center of Middle Eastern Studies and Department of Near Eastern Studies




I. Program description and overall purpose for evaluation
The Arabic Graduate Teaching Assistants (GTAs) have all attended the same teacher training workshop in which a particular “vision” of best language teaching practices was presented and discussed. The purpose of the professional development program is to guide the GTAs to a fuller understanding of those practices and to increase their ability to implement them. The GTAs will engage in 6 teacher development activities: (a) participation in and reflection on lesson planning meetings, (b) reflection on their own planning and teaching, (c) observation of colleagues, (d) videotaping and reflective viewing of their own teaching, (e) seeking formative student feedback on their teaching, and (f) being observed by the coordinator. The teacher development program is being developed “evaluatively”; that is, by considering evaluation questions throughout the planning process and by including an explicit plan for formative and summative evaluation of the program. The evaluation component of the plan is described below.


II. Intended uses, users, and evaluation questions

Intended uses
(a) To determine and document the success of the program in developing GTAs’ understanding of and ability to deliver language instruction that is in line with the vision.
(b) To identify areas of weakness/possible improvement of the plan from all stakeholders’ perspectives in order to further develop the plan.
(c) To inform the development of the vision itself as well as the way that it is taught to new GTAs.


Intended users
The Arabic GTAs, the Middle East language coordinator, the director of the Center for Middle Eastern Studies, and the head of the Near Eastern Studies Department


Evaluation questions
(a) Does the development plan lead to increased teaching of Arabic based on the identified good teaching practices?
(b) Does the development plan lead to an increased sense of agency?
(c) Does the development plan lead to an increase in reflective language teaching?
(d) Which of the activities in the plan are most useful and why?
(e) Is the development plan meeting the users’ (GTAs’) needs?


III. Methods, tools, and procedures

A variety of data sources will be used to answer the questions, including qualitative analysis of GTAs’ written reflections on their lesson planning and teaching; questionnaires for GTAs in the middle of the training period and at the end; faculty classroom observations of the GTAs’ teaching; and student questionnaires.


IV. Intended value added:

The evaluation process will provide information from students, GTAs, and other faculty that can contribute to the articulation and development of a program-wide understanding of good teaching practices.

USCB Spanish Evaluation: “From Growing Pains to Student Gains”

Babet Villena-Alvarez

University of South Carolina Beaufort, Foreign Language




I. Program description and overall purpose for evaluation
(a) Student Enrollment (approximate): Certificate (8 courses): 15 students; Minors (9 courses): 8 students; Majors (10 courses for the Heritage track or 16 for the General major): 13 students
(b) Institutional Background: USCB is the newest baccalaureate degree-granting institution in the state, with 1,200 students in the state’s fastest growing region.
(c) Program Description: The Spanish degree is the 9th degree approved by the South Carolina State Commission on Higher Education (SC-CHE), in January 2006.


The overall purpose is “to evaluate how well the Spanish curriculum at USCB prepares its students in the major, minor and certificate programs.” This evaluation is summative (to judge and value) and formative (to develop and improve).


II. Intended uses, users, and evaluation questions
Intended uses
(a) Improve Program (Quality of Spanish curriculum)
(b) Conduct Outcomes Assessment and Institutional Effectiveness (Quality of faculty)
(c) Promote Student Advancement, Retention, and Recruitment (Quality of students)
(d) Support other university programs (Quality of university curriculum).


Intended users
Spanish Faculty (Full-time & Part-time), Office of Institutional Effectiveness, Office of Academic Affairs, Spanish students, community (local businesses & offices).


Evaluation questions
Overall question: How well does the Spanish curriculum prepare its students in the major, minor and certificate programs?
(a) What do major/minor/certificate students need to know to communicate effectively in Spanish in their career choice?
(b) With respect to cultural competency, how capable are the students in distinguishing Spanish/Iberian cultures from the Hispanic/Latin American cultures?
(c) With respect to the major/minor/certificate in Spanish, what are the students’ goals and motivations?
(d) To what extent do students in the major/minor/certificate programs participate in and enhance their cultural knowledge through activities and resources outside class (e.g., movies/audiovisual materials in the target language, club activities, international festivals and other events, and volunteer work where the target language is used)?


III. Methods, tools, and procedures
Surveys, electronic and/or paper (a pre-program and a post-program survey); an exit interview with the Program Director or a full-time faculty member; in-course tests; testimonials at the end of the exit interview; and Institutional Effectiveness and Outcomes Assessment.


Survey instruments: two questionnaires; an exit interview; a list of cultural content items relating to the specific items to be evaluated, to be embedded and incorporated in specific course syllabi and tests; a form to be completed for testimonials; and an Institutional Effectiveness and Outcomes Assessment document for the Office of Institutional Effectiveness (for accreditation purposes).


IV. Intended value added
The main directive from the USCB administration for the coming academic year is to grow the Spanish Program. An effective evaluation plan needs to be embedded in this growth process to ensure on-going curricular improvement, to put faculty accountability in place, and to substantiate student language proficiency.

Evaluation of the Arabic and Persian Summer Language Institute,
Georgetown University

Kassem Wahba

Georgetown University, Department of Arabic and Islamic Studies




I. Program description and overall purpose for evaluation

In collaboration with the Department of Arabic and Islamic Studies, the Summer School in the Department of Continuing Education offers an Arabic language program within the Arabic & Persian Summer Institute. The Arabic program is an undergraduate language program of intensive and non-intensive courses lasting ten weeks. The courses offered include basic, intermediate, and advanced level Modern Standard Arabic and elementary classes in the Iraqi, Levantine, and Egyptian dialects of Arabic. Enrollment is approximately 60 students for Arabic and 13 for Persian. The students’ motivations for taking Arabic include wanting to work for the government, wanting to specialize in the Middle East, and fulfilling education requirements.

The overall purposes of evaluation are (a) to offer the administration information regarding the program’s curriculum structure and its impacts, and (b) to ascertain the effectiveness of the curriculum.


II. Intended uses, users, and evaluation questions

Intended uses
(a) To find out how well the current Arabic language curriculum meets students’ needs
(b) To identify what is missing in the curriculum and to propose a plan to implement necessary changes for improving the curriculum


Intended users
Various administrators responsible for the program at the level of the university, within the department of Arabic and Islamic Studies and the Summer School of Continuing Education.


Evaluation questions
(a) What are the learners’ needs in studying Arabic?
(b) What are the current student learning outcomes?
(c) Is there a match between the outcomes and the learners’ needs?
(d) What changes are needed in order to better respond to learners’ needs?


III. Methods, tools, and procedures

A needs analysis will be carried out to gather information on administrators’, faculty members’, and learners’ perceptions of the students’ learning needs. The tools to be used are interviews and a questionnaire.


IV. Intended value added

(a) To increase awareness among the administration and the faculty of the potential need to modify the Arabic language curriculum to meet learners’ needs.
(b) To understand and improve the current program practice.
(c) To focus on the effectiveness of teaching and learning in terms of learning outcomes.

Identifying a Nationally Accepted Tool to Assess Student Learning Outcomes in the LCTLs at the University of Florida

Ann Wehmeyer

Associate Professor and Chair
University of Florida, Department of African & Asian Languages & Literatures




I. Program description and overall purpose for evaluation

The purpose of the evaluation is twofold. First, we wish to determine whether student outcomes in our foreign language programs are comparable to those in external foreign language programs. Second, we want to ensure that training capacities for nationally accepted proficiency testing are incorporated into all of our language programs (Akan, Amharic, Arabic, Chinese, Hebrew, Hindi, Japanese, Korean, Swahili, Vietnamese, Wolof, Xhosa, Yoruba).

II. Intended uses, users, and evaluation questions

The users of the evaluation program are department faculty, undergraduate and graduate students of language courses, area studies center directors, CIBER director, Dean of International Studies, and accreditation agencies. For faculty, the evaluation will serve to clarify our respective statements on student learning outcomes, and to identify a nationally recognized assessment tool which can be effectively applied to assess one or more of these outcomes. It will also enable us to incorporate into our language program features in common with external language programs. For students, the evaluation will serve to provide them with an opportunity to obtain nationally recognized language credentials. For administrators in related programs and centers, the evaluation will supply data that can be used to substantiate claims about the quality of our foreign language programs for purposes of grant application and reporting. For accreditation, the evaluation will serve to put in place an ongoing system of monitoring and assessment.


The overarching question underlying this evaluation is whether student outcomes in our language programs are comparable to those in external language programs. To address this larger question, we will need to take the following steps: (a) What are the student learning outcomes in each of our respective language areas?, (b) What is the most commonly accepted assessment tool in each of the language areas?, (c) Is it feasible to apply this assessment tool to all students at selected levels of language? The answer to question (c) will require an investigation of fit with student learning outcomes, the cost of materials, and the cost of the human resources investment in training and rating.


III. Methods, tools, and procedures

While languages that are part of a major have explicit student learning outcomes, those that are not yet part of a major lack formal statements in this regard. First, the language coordinator of each language unit will work with faculty in the unit to articulate, or revise, a formal statement of student learning outcomes. Second, each unit will determine which assessment tool among those currently available (e.g., for oral proficiency, ACTFL’s OPI, SOPI, or COPI; for reading and writing proficiency, Oregon’s STAMP) is most commonly applied in its language area. In conjunction with the chair, each language unit will determine the costs of materials and the costs of the human resources investment. Purchase of materials and training will commence along a feasible timeline. The selected assessment tool will be administered toward the end of the second year of language study and, in languages that are part of a major program, also toward the end of the final semester of language study. Results will be reported to all users.


IV. Intended value added

As faculty become knowledgeable about national assessment measures, their ability to incorporate skills targeted by nationally recognized tests into their respective language curricula will be enhanced, thereby bringing our programs in the LCTLs more in line with the goals of programs in the more commonly taught languages.

Use of Evaluation for Developing Russian Summer Abroad Program

Valentina Zaitseva

Lecturer, University of Washington,
Department of Slavic Languages and Literatures




I. Program description and overall purpose for evaluation

The UW Department of Slavic Languages and Literatures has decided to offer a Russian summer abroad program in Sochi, modeled after the existing Czech summer program in Prague (set up as an Exploration Seminar: 3 weeks, 6 credits). The overall purpose of this project is to evaluate and inform the development of the Russian Summer Program Abroad (Sochi).


II. Intended uses, users, and evaluation questions

The intended uses of this project are (a) to establish how the Russian Summer Program would contribute to the goals and missions of the Department; and (b) to evaluate how the Summer Program in Sochi would meet learners’ needs and satisfy the requirements of Social Science Research Council (SSRC) funding for Russian summer programs.


The intended users of the evaluation are the chair, administrators, and senior faculty; Russian summer program faculty; the SSRC; intermediate and advanced UW undergraduates; graduate students; and faculty and administrators at Sochi University.


Evaluation questions
(a) What are the pros and cons of setting up a 4-week intensive language program (45 contact hours, the equivalent of one quarter)? Which models for the summer program would best meet the students’ needs, departmental standards, and SSRC requirements?
(b) What kind of curriculum would meet the stated goals? How can we build in special projects in undergraduate and graduate research using local Sochi opportunities?
(c) How can local Sochi opportunities be used in task-based language teaching or in a special socio-cultural projects curriculum?
(d) Which components of the Czech summer program in Prague, and which recruitment strategies, are transferable to the Russian program in Sochi?
(e) How would Russian summer courses fit curricularly into the overall Slavic Department program? What implications for the regular curriculum would the summer program entail?
(f) What outcome assessment should be implemented at the end of the Russian Summer Program?


III. Methods, tools, and procedures

We are going to use a learner needs analysis (through survey questionnaires and analysis of program requirements). We will collect pertinent data and perform a document analysis (UW study abroad and other FL institutions’ websites, information about Sochi University). Individual interviews, meetings, and focus group discussions will help us build consensus on the length, intensity, and curriculum of the Russian summer program in Sochi. Informal preliminary meetings with faculty in Sochi in August 2007 will outline the specific evaluative points to be incorporated into the next cycle of evaluative and program-building activities.


IV. Intended value added

While the development of the Russian Summer Program Abroad is still at a very early stage, the evaluative approach to program building helps address a broad range of topics and issues in a practical and specific way, taking into consideration the needs and improvement of the entire Slavic Department program.


Evaluating the Need for Formal Graduate L2 Teacher Training

Alessandro Zannirato

Johns Hopkins University,
Department of German and Romance Languages and Literatures




I. Brief description of the program and overall purpose for evaluation
The Italian section offers both majors and minors in Italian studies and has a vibrant PhD program in Italian literature. Most doctoral candidates are required to start teaching undergraduate language courses immediately after being admitted into the PhD program, and many of them have limited training in L2 teaching. At present, the only training the GTAs receive takes place through weekly meetings with the program directors or coordinators and through reflection on classroom observations. One may wonder whether this is sufficient, or whether some form of more formalized training would be advisable.


II. Intended uses, users, and evaluation questions
Intended uses and users
Through this evaluation, we endeavor to: (a) evaluate the usefulness of formal GTA training in L2 language teaching in order to improve the quality of the undergraduate program; (b) reflect on the graduate curriculum and weigh the possibility of including formal training in L2 language teaching; and (c) determine whether the investment needed to introduce formal L2 language teacher training would be justifiable. The primary intended users are the Language Program Directors, Section Heads, and the Department Chair. Secondary intended users include the Dean’s Office and graduate students.


Evaluation questions
The evaluation questions are: (a) Is it possible to improve the quality of our undergraduate courses through structured GTA training in L2 teaching? (b) Can we make the GTAs’ academic experience a better one? (c) Could we provide our graduate students with the skills and formal training to become more competitive in the job market? (d) Is our department following nationally recommended practices with regard to graduate student teacher training?


III. Methods, tools, and procedures
Document analysis, focus groups, meetings, and questionnaires will be used. We will create an online questionnaire using SurveyMonkey and review relevant articles from the Chronicle of Higher Education. The following are the steps we plan to take.


(a) Program directors will review last year’s teaching observation reports across the different languages, identify recurring themes in order to gauge the quality of GTA teaching, and formulate recommendations.
(b) A focus group with one GTA from each language will be organized in order to prepare a questionnaire to be distributed to all GTAs, with the objective of verifying whether they are satisfied with their teaching abilities. GTA responses may then be triangulated with the program directors’ recommendations.
(c) A survey will be administered to literature faculty members to gauge their perceptions of L2 language teacher training and its relevance in the graduate curriculum.
(d) Recent job advertisements will be reviewed to ascertain whether formal training in L2 language teaching is considered a necessary or desirable qualification by search committees.
(e) A representative sample of peer institutions will be identified, and the role of L2 teacher training in their curricula reviewed, along with FL professional organizations’ recommendations (e.g., MLA, ACTFL).


IV. Intended value added
This evaluation will hopefully contribute to a better understanding of the role applied linguistics can play in today’s foreign language departments.



©2007 John Norris, Yukiko Watanabe, Marta Gonzalez-Lloret & Hye Ri Joo