Assessment Narrative - Seattle University




This assessment model is part of the WPA Assessment Gallery and Resources and is intended to demonstrate how the principles articulated in the NCTE-WPA White Paper on Writing Assessment in Colleges and Universities are reflected in different assessments. Together, the White Paper and assessment models illustrate that good assessment reflects research-based principles rooted in the discipline, is locally determined, and is used to improve teaching and learning.

 


Institution: Seattle University
Type of Writing Program:  Writing in the Disciplines
Contact Information: John C. Bean
 Consulting Professor for Writing and Assessment
 Department of English
 Seattle University
 Seattle, WA 98122
 206 296-5421
 jbean@seattleu.edu

Assessment Background and Research Question

Our research question is simple: to what extent do seniors in each undergraduate major produce “expert insider prose” in their disciplines? (For the term expert insider prose, see later references to Susan Peck MacDonald.)

Seattle University has no formalized “W-course” program in either WAC or WID. Rather, we have a Core Curriculum that requires “a substantial amount of writing” in every core course. When students enter their majors, instructors in each field assume responsibility for teaching students how to think and write within the discipline. The assessment movement on our campus has encouraged departmental faculty to think systematically about how students learn to produce disciplinary discourse. Although our efforts were initially driven by accreditation pressure, we soon discovered how the assessment process could lead to improvement of assignments, instructional methods, and curriculum design. In particular, we discovered that assessment could help departments achieve better vertical integration of their curricula and lead to higher-quality capstone writing projects from their students.

Our approach to assessment adapts insights from three theoretical perspectives:

  • The assessment strategy of the “embedded assignment” developed by Barbara Walvoord and Virginia Anderson in Effective Grading (see also Walvoord’s Assessment Clear and Simple). In this approach, the basic assessment act is the individual instructor’s grading of students’ performance on an assignment already embedded in a course. The instructor develops a rubric to grade the assignment and to report results to departmental faculty. The process for this strategy is described later in this document.
  • Susan Peck MacDonald’s theory of how students’ growth as writers progresses in stages from “pseudo-academic prose” upon entry into college through “generalized academic” writing and “novice approximations” of disciplinary ways of writing to (one hopes) “expert, insider prose” in the senior year. (See Professional Writing in the Humanities and Social Sciences, p. 187.) One goal of our writing program is to promote students’ growth toward expert insider prose as revealed in disciplinary capstone projects in their senior year.
  • A rising junior writing assessment modeled after the work of Richard Haswell, Bill Condon, and their colleagues at Washington State University. Because of the way our Core is designed, we have placed this assessment at the beginning of the sophomore year rather than the junior year. It consists of both an impromptu timed essay and a first-year seminar assessment made by each student’s first-year seminar instructor (our adaptation of the portfolio requirement in the WSU model). The goal of this mid-career assessment, which is still under development, is to identify weak writers and provide them with extra instruction and support before they enter their majors.

Assessment Methods

Our method for assessing writing in the majors is surprisingly simple. To date, the method has been implemented primarily in finance, chemistry, history, economics, and English. In 2007–2008, through a planning grant from the Teagle Foundation, it is being extended to political science, and a growing number of departments are interested in trying it or are already developing their own variations.

Using this method, a department’s first task is to create learning outcomes for the major. Almost always, one of these outcomes asks students to produce some kind of professional paper within the discipline. MacDonald’s stage theory of writing development helps focus departmental discussions: what constitutes expert insider prose for undergraduates within our discipline? The resulting disciplinary descriptions of “expert insider prose” map well onto the taxonomy of genres identified by Michael Carter in his excellent CCC article “Ways of Knowing, Doing, and Writing in the Disciplines.”

After a department has defined the kinds of expert insider prose it expects from seniors, it initiates an assessment process as follows:

  • The department chooses a senior-level course in which the designated kind of expert insider prose is required (the embedded assignment).
  • The instructor grades the assignment using a rubric. (Sometimes departmental faculty work together to create the rubric.)
  • The instructor analyzes rubric scores to uncover patterns of strengths and weaknesses in student performance.
  • The instructor presents the results at a department meeting, initiating faculty discussion. (In some cases, randomly selected sample papers are graded by the whole department in a departmental norming session.)
  • The department discusses strategies that might be implemented earlier in the curriculum (new kinds of assignments, additional instructional units/modules, redesign of a course) to ameliorate weaknesses. This is the essential “feedback loop” stage of the assessment process.
  • The department tries out the new methods, often deciding to use an embedded assignment from earlier in the curriculum (what MacDonald’s theory would identify as the “novice approximation” stage of student development) for the following year’s assessment project. The purpose of this follow-up activity is to determine the effectiveness of the new assignments or instructional methods.

The power of MacDonald’s stage theory is that it helps departmental faculty appreciate the importance of early courses in their major for teaching disciplinary discourse. To improve disciplinary writing in the senior year, faculty need to teach disciplinary methods of inquiry, research, and argument in their sophomore- and junior-level courses through better assignments and instruction. Moreover, this approach has led many departments to coordinate with research librarians to develop structured assignments for teaching discipline-specific information literacy.

One should note that the assessment process just described places almost no emphasis on high-stakes testing or on accountability. Departments are not trying to weed out weak writers or to provide administrators with statistical evidence that the department’s graduates are meeting certain standards. Rather, the goal is to discover weaknesses in senior-level papers and to make changes in curriculum and instruction to address them. The process and the data are entirely owned by the department.

Assessment Principles 

We believe that our program for assessing writing in the majors follows the best principles of assessment identified by the National Council of Teachers of English (NCTE) and the Council of Writing Program Administrators (WPA). It is low stakes, directly tied to improvement of teaching practice, locally designed and implemented, inherently social, authentic, performance-based, and aimed at aligning testing and curriculum. What we like about this approach is that it has no direct consequences for individual students; rather, it focuses faculty attention on characteristic patterns of weakness in student performance and creates discussion of how to ameliorate them. These discussions often focus on widely encountered problems (e.g., students not understanding the demands of a disciplinary genre) as well as on particular problems associated with second-language speakers or persons with disabilities. Extra support is often provided for weaker writers through the university’s peer-tutoring writing center in the Learning Center.

Assessment Results

As can be expected from our decentralized approach, each department has its own assessment story. Initial departmental discussions often reveal faculty disagreement about what constitutes “expert insider prose” for undergraduates. Professors often realize that they haven’t been explicit about “insider” features and that their assignments sometimes evoke what MacDonald would call “pseudo-academic” writing rather than disciplinary arguments. The resulting discussions have typically led to clarification of expectations for seniors and to the “backward design” of curricula, whereby departments have made changes in earlier courses to teach the processes of inquiry, thinking, and research needed for capstone papers. Here are some examples:

  • The history department, unhappy with senior papers that were often narratives without theoretical sophistication, changed its sophomore-level gateway course to introduce historical theory and to develop new kinds of assignments. Particularly, faculty wanted students to apply different theoretical perspectives to historical problems and to create thesis-governed historical arguments using primary sources and archival data.
  • The chemistry department decided that the typical lab report was “pseudo-academic prose” that didn’t teach students to construct themselves as scientists. Two professors redesigned the labs for sophomore organic chemistry in order to teach the empirical research report in the manner of professional chemists. They discovered that the introduction to a scientific paper requires the highest level of contextualized critical thinking since the problem being investigated is always connected in complex ways to unknowns identified in a review of the literature. Discussions of these issues within the whole department have been one of the rewards of this approach to assessment.
  • The English department has redesigned its curriculum so that every 400-level literature course requires a researched literary argument informed by theory and aimed at presentation at an undergraduate research conference. Using the principles of backward design, the department has created an integrated sequence of writing assignments that increase in complexity from 200-level survey courses (where the focus is on close reading and formalist analysis) through 300-level courses (where one of the required courses introduces postmodern theory and explicitly teaches students how to position their own views in a conversation of critics) to the final capstone papers in 400-level courses.
  • The economics department discovered that economics majors, unlike professional economists, did not instinctively draw graphs on the backs of envelopes in the first 30 seconds of economic discussions. The department’s assessment focus has been on “rhetorical mathematics”—increasing students’ ability to interpret graphs and to construct graphs that tell a significant economic story. This approach has led to new kinds of numbers-based writing assignments throughout the curriculum.
  • The finance department has defined its capstone projects as short persuasive memos, addressed to specified audiences, arguing for a “best solution” to an ill-structured (open-ended) finance problem. Because finance professionals must frequently address lay audiences as well as finance experts, the department is especially interested in students’ ability to shift audiences, constructing some arguments in an expert-to-lay context (with appropriate use of language and graphics) and others in an expert-to-expert context. Faculty are beginning to add new kinds of writing assignments to the curriculum.

Assessment Follow-Up Activities

As explained earlier, we have used our assessment data primarily to drive a robust feedback loop so that findings lead to improvements in curricula and instruction. Our approach has led to a planning grant from the Teagle Foundation (jointly with Gonzaga University) in which we are attempting to use embedded reflection assignments to assess the impact of our Catholic/Jesuit mission on students’ commitment to social and environmental justice in a broad multicultural context.

In terms of accreditation, we have yet to test this approach to assessment in a full-blown accreditation review. We are confident, however, that our approach—despite its lack of psychometric benchmark data—will meet with approval.

Assessment Resources

Our basic approach to assessment of writing in the majors requires minimal resources or faculty time. We ask departments to spend one department meeting per year discussing the results of an embedded assignment project. Because the project itself uses an assignment already embedded in an instructor’s class, the instructor’s “extra time” consists of creating a rubric (although many instructors already use well-designed rubrics), analyzing the rubric data for patterns of strengths and weaknesses, and preparing a short report for the department. What often requires extra time and resources is the feedback loop, if the department wants to make significant changes in curricula or instruction. But this kind of work is already embedded in the everyday lives of professors with a strong commitment to students and to teaching. In the early days of our assessment initiatives, some departments received in-house grants to fund departmental projects, mostly used to provide food for meetings or stipends for a short summer workshop. In general, however, the process can proceed without additional funding. (In contrast to our methods for assessing writing in the majors, our mid-career writing assessment, mentioned earlier, has required considerable university resources for administering the impromptu essay and for paying readers to attend norming sessions and score the essays.)

Sustainability/Adaptability

The embedded assignment approach seems easy to adapt to any setting, as has been shown by Walvoord and her colleagues in their influential publications. In fact, what hinders the embedded assignment approach, ironically enough, is the faculty belief that authentic assessment must involve more work than this.

References

Bean, John C., David Carrithers, and Theresa Earenfight. “How University Outcomes Assessment Has Revitalized Writing-Across-the-Curriculum at Seattle University.” WAC Journal: Writing Across the Curriculum 16 (2005): 5–21.
Bean, John C., and Nalini Iyer. “‘I Couldn’t Find an Article That Answered My Question’: Teaching the Construction of Meaning in Undergraduate Literary Research.” Teaching Literary Research. Ed. Steven R. Harris and Kathy Johnson. New York: American Library Association [forthcoming].
Carrithers, David, and John C. Bean. “Using a Client Memo to Assess Critical Thinking of Finance Majors.” Business Communication Quarterly [in press].
Carrithers, David, Teresa Ling, and John C. Bean. “Messy Problems and Lay Audiences: Teaching Critical Thinking within the Finance Curriculum.” Business Communication Quarterly [forthcoming].
Carter, Michael. “Ways of Knowing, Doing, and Writing in the Disciplines.” College Composition and Communication 58.3 (2007): 385–418.
MacDonald, Susan Peck. Professional Writing in the Humanities and Social Sciences. Carbondale: Southern Illinois UP, 1994.
Walvoord, Barbara. Assessment Clear and Simple: A Practical Guide for Institutions, Departments, and General Education. San Francisco: Jossey-Bass, 2004.
Walvoord, Barbara, and Virginia Anderson. Effective Grading: A Tool for Learning and Assessment. San Francisco: Jossey-Bass, 1998.