Chris Anson Plenary Address Discussion

At the 2006 WPA Conference in Chattanooga, in keeping with the conference themes, our three plenary speakers presented us with perspectives on the past of the organization and provided an overview of some possible opportunities to look out for in the future. In the second plenary, Chris Anson challenged us to extend the research we do in ways that provide a solid foundation for our work--and as a way of responding to public challenges to that work with real data. I hope that those of you who were present for this plenary address will continue the discussions we began at the conference by adding to this forum your recollections and notes from the breakout groups, as well as your further thoughts about the issues raised in this plenary. I also hope that others, whether able to attend the conference or not, might help us to continue (and to archive) our discussion on the topics raised by Chris' address.

Comments

The members of our discussion group at the WPA conference were quite compelled by Chris's overarching theme: that we as a field need at least to balance our qualitative and introspective scholarship with empirical studies.

However, members of our group also raised a few concerns and questions. We wondered whether the social sciences, or even data-driven professional practices like business or medicine, might not provide better models for data-driven studies of writing than the hard sciences (Chris's comparison between composition and biology). And while we agreed that, ultimately, it would be a good thing if (some) graduate students in the field learned empirical methods, we also wondered who'd teach those methods to them, and how such students would fare professionally, especially early in their careers.

Our group's response to Chris Anson's address centered mainly on empirical research, specifically how WPAs who are not currently trained in empirical research methods might acquire such training. I'll summarize the group's thoughts/comments below:

EMPIRICAL RESEARCH
--How might the CWPA help WPAs to become proficient in the language and methods of empirical research? (This theme seemed to surface throughout the conference)
--We would LOVE to see a one-day institute (and possibly a longer one) on empirical research methods at the 2007 WPA. We imagined that a first step might be an introduction to statistics--how to use them, how to interpret them, and how to frame them rhetorically.
--Ideally, we might conceive of this project in multiple steps, beginning with an intro to statistics (2007 WPA), then possibly supporting WPAs through a series of developmental steps. Perhaps WPA could hold a CCCC workshop (2008) where participants could build on the intro, bringing in proposed study designs and getting support conceiving of these studies.
--To close the loop, perhaps there could be a panel at the 2008 WPA presenting findings and work in progress.

THE AUDIENCE FOR RESEARCH
--As these kinds of studies are conceptualized, there's a need to develop studies that might appeal to both local institutional audiences and audiences in the profession.
--We discussed the way institutional review boards approve of local assessment, but require IRB clearance for broader dissemination of results. (Cynical Aside: is this a way to keep local findings private?)
--One of the challenges of assessing a writing program is the need to study student populations broadly, something that human subjects protocols can militate against (e.g., individual students may withhold permission, but it's important to be able to track broad programmatic trends; how can we educate IRBs to allow this work to be done?)

OPPORTUNITIES FOR COLLABORATION
--WPAs need assistance learning about research methods from outside the field, specifically in areas of psychology and the social sciences: educational psychology, linguistics, behavioral and cognitive research
--WPA as an organization might sponsor and support these initiatives

Our group also agreed on the need for a research institute, as well as the need to replicate studies. (I won't go into much detail here--everything's been pretty much said.) This topic was also discussed in Joe Janangelo's session following the breakout groups.

Our break-out session group was inspired by Chris' talk to ask some questions of our own:

How do we make research findings understandable and interesting to the public? When should we communicate these findings? And where should we publish them?

Furthermore, why aren't we more proactive in writing press releases to publicize our research? One participant mentioned that Nan Miller’s report is posted on a right-wing website and was discussed on a right-wing radio station, and we don’t tend to publish in these venues and participate in these conversations. Unless we make our research more public, in the "right" places, it might not matter.

Another participant asked, who is the audience for our research? What is our motivation for doing research? And do our “traditional” motivations conflict with communicating research results effectively to the public?

In response, we talked about how we don’t seem to agree as a discipline on the kind of research we should be doing and to whom we should be communicating the results of that research. One attendee gave the example that there isn’t a standard research methods textbook to use with graduate students.

We discussed how we train our graduate students in Rhet/Comp to problematize—to see all of the potential problems in an empirical design—so they often learn to simply dismiss the design instead of engaging in the research. We don’t train students to engage in empirical research.

We also discussed concern for faculty who work at institutions where there is not much support (or time) for a research agenda, such as two-year colleges. How would these faculty find the support to participate in this type of research? Can we promote cross-institutional research to share resources?

Here is my point of entry, loosely paraphrased from Chris' talk.
-----
Belief is at the heart of this ideological warfare, not fact.

How should we respond? Do we respond with beliefs? No, we respond with data from research.
-----
In the group discussion after the talk, I pointed out how Chris' plan differs from George Lakoff's current work in reframing public discourse. Lakoff's work centers on coming up with new frameworks for capturing the public's belief--frameworks that don't necessarily carry a conservative agenda, but instead appeal to the public's sense of making a decent living available to all Americans. Chris wants to make sure we have the research available to back up generalizations, but we also need to be able to cast those generalizations in frameworks that will appeal to a public without the patience to wade through evidence. Or maybe I misunderstand, and we're talking about only a segment of the public who can appreciate research findings.

So, through what frameworks can we interpret and disseminate research findings to a skeptical public and tightfisted elected officials so that they will trust our good will, public service, and respect for tax and tuition payers?

Michael Day
