PRAXIS
a writing center journal
15.1: TWO-YEAR COLLEGES
VOL 15, NO 1 (2017): TWO-YEAR COLLEGES

TABLE OF CONTENTS

ABOUT THE AUTHORS

COLUMNS

From the Special Editor
Genie Giaimo

Drafting a Writing Center: Defining Ourselves through Outcomes and Assessment
Joshua Geist and Megan Baptista Geist

Cultivating Professional Writing Tutor Identities at a Two-Year College
Alison Bright

Creative Staffing for the Community College Writing Center in an Era of Outsourced Education
Jill Reglin

The Bronx Community College Presents: “Who We Are”
Jan Robertson

FOCUS ARTICLES

“At First It Was Annoying”: Results from Requiring Writers in Developmental Courses to Visit the Writing Center
Wendy Pfrenger, Rachael N. Blasiman, and James Winter

Can We Talk: Institutional Outcomes Assessment of a Genre-Analysis Consultation Approach
Brett Griffiths, Randall Hickman, and Sebastian Zoellner

“Our Students Can Do That”: Peer Writing Tutors at the Two-Year College
Clint Gardner

Focusing on the Blind Spots: RAD-Based Assessment of Students’ Perceptions of Community College Writing Centers
Genie Giaimo
ABOUT THE AUTHORS

Rachael Blasiman, Ph.D., is an Assistant Professor of Psychology at Kent State University's Salem campus. As an experimental cognitive psychologist, she is primarily interested in learning, memory, and student success. Her past work examines student study habits and interventions to improve learning and academic performance.

Alison Bright, Ph.D., is a lecturer in the University Writing Program at the University of California, Davis. Her research interests include the preparation and professional development of writing tutors, teaching assistants, and pre-service teachers. Her research has been published in Teaching/Writing and English Education.

Clint Gardner, M.A., is the Program Manager of College Writing and Reading Centers at Salt Lake Community College. He is currently the President-Elect of the Rocky Mountain Writing Centers Association, and recently served as President of the Board of Trustees of the National Conference on Peer Tutoring in Writing. He has been Archivist for the Two-Year College Association of the National Council of Teachers of English (TYCA) since 2011, and served as TYCA-West National Representative (2011-2013) and Secretary of TYCA (2008-2011). His other professional leadership includes the International Writing Centers Association (President, 2005-2007). His writing center research and development interests include the work of peer consultants (tutors) in community college writing centers.

Joshua Geist, M.F.A., has been tutoring in various Writing Centers since 2001, when as a freshman undergrad he took a sequence of training courses for Writing Center tutors, about which he had many opinions and became very frazzled. He now serves as the Director of the Writing Center at College of the Sequoias in Visalia, California, where his primary job duties include teaching a sequence of training courses for Writing Center tutors, having opinions, and being frazzled.
Megan Baptista Geist, M.F.A., is the Writing Center Coordinator at College of the Sequoias in California’s central valley, where she oversees twenty tutors in writing centers on three campuses. Her passions include acceleration pedagogy and outcomes assessment.

Genie Giaimo, Ph.D., is the current Director of The Ohio State University Writing Center. Before her recent arrival to OSU, she was Assistant Professor of English and Director of the Writing Centers at Bristol Community College. Her research applies RAD-based methodologies to large-scale and often systemic issues within writing center administration, such as student perceptions of the writing center in open access institutions, or the impact of ordinary and extraordinary stress on writing center workers. She has published articles in peer-reviewed journals such as Language and Literature, Literature and Medicine, and European Journal of Life Writing and chapters in edited collections on pedagogy and literary studies. She is also the special editor of the Praxis issue on Two-Year College Writing Centers.

Brett Griffiths, Ph.D., is the current director of the Macomb Reading and Writing Centers at Macomb Community College. She completed her Ph.D. in composition studies in the Joint Program for English and Education at the University of Michigan. Her primary research interests are writing instruction at access institutions, first-year writing, and departmental and institutional approaches that support quality instruction and student success. She has published work on mathematics education and the effects of poverty on writing instruction at community colleges. She also publishes creative non-fiction and poetry.

Randall Hickman, Ph.D., has more than two decades of work experience in educational research, at the University of Texas Research and Development Center for Teacher Education, and institutional research, at Central Piedmont Community College (in Charlotte, North Carolina), at Tri-County Technical College (in Anderson, South Carolina), and at Macomb Community College (in Warren, Michigan). He has published articles on workforce development, developmental education, and research methodologies and has presented research results at numerous conferences.

Wendy Pfrenger, M.A., teaches English and coordinates the Learning Center at Kent State University Salem. She is interested in improving retention and completion rates for rural students. She also directs the Kent State University Rural Scholars Program, a post-secondary opportunity program for middle and high school students in Columbiana County, Ohio. Her work and research focus on college access, writing center practice, and composition pedagogy.

Jill Reglin, M.A., has held her position as Coordinator of the Writing Center at Lansing Community College (Lansing, Michigan) since 1997. As a full-time faculty member in the English Department, she also teaches various levels of composition. Reglin has been an active member on the boards of the Michigan Writing Centers Association, the East Central Writing Centers Association, and the International Writing Centers Association. She has served as a leader in the IWCA Summer Institute for Writing Center Directors and Professionals four times. Her current research interest (and subject for an upcoming sabbatical) is focused on preparing faculty who have worked in the Writing Center to provide WAC assistance to their peers.
Jan Robertson, M.A., has been director of the Bronx Community College Writing Center for fifteen years and the co-trainer of the Bronx Community College CRLA college-wide tutors for eleven years. She has also taught at CUNY for 27 years, teaching English and all levels of ESL, including the CUNY Language Immersion Program. Further, she has taught ESL and Spanish in NYC public and independent schools. She has an MA in English from Columbia University, Teachers College, and an MA in Educational Theater from New York University, where her focus was designing drama-in-education lessons for ESL students.

James Winter, M.F.A., teaches English and coordinates the Learning Center at Kent State University Salem. This is his first academic publication. His fiction and nonfiction have been published or are forthcoming in One Story, Prairie Schooner, Salamander, PANK Magazine, Midwestern Gothic, Prick of the Spindle, and The Rubbertop Review.

Sebastian Zoellner, Ph.D., is Professor of Biostatistics and Professor of Psychiatry at the University of Michigan. He has the equivalent of a Master’s degree in Mathematics and a Ph.D. in Biology from the University of Munich. His primary research is in population genetics and the genetics of psychiatric diseases. He is interested in the parallels between epidemiological and educational research.
Praxis: A Writing Center Journal • Vol 15, No 1 (2017)
FROM THE GUEST EDITOR

Genie Giaimo
The Ohio State University
giaimo.13@osu.edu

There is, perhaps, no more popular (or contested) representation of the unique challenges faced by students at Two-Year Colleges than in the television sitcom Community (created by Dan Harmon), which follows the lives of a number of community college students, administrators, and teachers as they navigate Greendale Community College. Based on Harmon’s personal experience attending community college, Community offers a humorous yet critical lens on the lives of students who attend community colleges, including their motivations, their life experiences, and their diversity. Yet there are a number of fallacies built into Community’s depiction of these students’ experiences at Greendale.¹ Namely, there is not nearly so much community or engagement in community college as the show suggests.

In the show, the main characters meet and become friends while studying for a Spanish class. In place of systematized tutoring (there appears to be no tutoring center at Greendale), these students take it upon themselves to learn the language and guide one another through the fantastical lesson plans of the erratic Señor Ben Chang (Ken Jeong). Although at least one of the group members, Shirley Bennett (Yvette Nicole Brown), has two children, and another, Britta Perry (Gillian MacLaren Jacobs), has an off-campus night job, these characters seem to have a large amount of personal time to get into hijinks, meet at the study table, and bond. The students appear to attend classes all day and are, for the most part, financially secure. Pierce Hawthorne (Chevy Chase) is a millionaire who at one point supports Annie Edison (Alison Brie) as she struggles to pay rent. Britta depends intermittently on her wealthy parents. Shirley starts her own on-campus business. Abed Nadir (Danny Pudi) and Troy Barnes (Donald Glover) live in a college dorm and don’t have off-campus jobs; Abed’s father even pays his tuition.
Even the main character, Jeff Winger (Joel McHale)—who loses his job as a well-paid lawyer due to forging a college degree, and who is, at one point, homeless and living in his car—remains materialistic and predominantly financially solvent in the show. Importantly, he manages to pay for course credits, the books he carries, the stylish clothing and accessories he wears, and the expensive smartphone he continuously uses. The students in
Greendale invest in the social and political life of the school. They demonstrate help-seeking and self-efficacy skills from the beginning of their enrollment at the college. The business of daily life infrequently interferes with school.

The reality of many community college students is far different. A 2017 Community College Survey of Student Engagement (CCSSE) report shows that work obligations account for much of community college students’ time: about half of respondents (n=97,420) work more than 20 hours a week (University of Texas Austin 4). Additionally, 50% of respondents (33% women and 17% men) (n=97,224) report having children living with them (University of Texas Austin 4). TYC students’ commuting time is also incredibly high due to lack of car access and poor public transit infrastructure. At my former institution, Bristol Community College, 67.7% of students commute between one and five hours a week (University of Texas Austin 8). TYC students spend more time working off-campus, caring for dependents off-campus, and commuting to and from campus than they do on campus.

The time that community college students spend off-campus working, caretaking, and commuting is not the only difference from the fictional representations of Community. In reality, tutoring is also quite different; it is more systematized, less consistently attended, and predicated on a top-down hierarchical model that often merges different types of tutoring in “learning commons” or “academic learning center” models. Perhaps the only point of connection between the show’s fictional community college and a real-world community college is in demonstrating how tutoring (informal or otherwise) can serve as a communal space in which peer tutors, students, faculty tutors, adjunct tutors, instructors, deans, and other administrators all intersect.
I open this introduction with a fictional representation of Two-Year Colleges that stands in sharp contrast to this special issue of Praxis, which is the first that is fully devoted to Two-Year College writing center assessment and praxis. The contributing authors come from diverse disciplinary backgrounds and are writing about community colleges and writing centers located all across the United States. One of the
main premises of this issue is that it is imperative to conduct local research and assessment, particularly at the Two-Year College level—single-lens representations (fictional or otherwise) cannot do justice to the wide variety of Two-Year Colleges operating in the United States. Community colleges are idiosyncratic and often develop in relationship to the communities they serve. Writing centers, similarly, are deeply heterogeneous. At Two-Year College writing centers especially, as the articles in this issue suggest, heterogeneity profoundly affects the everyday work of writing centers, from staffing to tutor identities, to tutoring models, to assessment practices.

The authors in this issue take up familiar concerns: writing center efficacy (Geist and Geist; Griffiths et al.; Pfrenger et al.); staffing models and needs (Reglin); tutor identities and their impact on writing center labor (Bright); the value of peer tutors in WC contexts (Gardner); diversity in the writing center (Robertson); and a RAD assessment of student perceptions of writing centers (Giaimo). Although the topics in this issue are applicable to most writing center work, Megan Baptista Geist and Joshua Geist make the crucial point that “Writing centers—and perhaps especially writing centers at two-year colleges—are not interchangeable” (9). Many of these articles draw heavily upon their localized contexts in their assessment and research, providing a TYC “twist” on some fairly well-covered topics. For example, Jill Reglin identifies a threat to writing center work in the form of administrative decisions that outsource tutoring to for-profit companies, a move that is popular at many TYCs and that further complicates staffing and training models.
Alison Bright’s piece on cultivating professional writing tutor identities indicates the need for more research about training for professional tutors, as it is common practice at TYCs to employ a “wide variety of tutors” with varying educational experiences (1). Pfrenger et al. analyze the effect that compulsory WC attendance has on students in remedial writing courses at Kent State Columbiana, where 81% of students need some kind of remediation (1-2). And although the argument for employing peer tutors has all but been put to bed as general “best practice” for writing center staffing models, Clint Gardner’s piece leads the call for hiring peer tutors in TYC WCs, despite hurdles like recruitment, training, and retention. The shared concerns of four-year-college writing centers and TYC writing centers suggest that there is a vital connection between these two kinds of institutions, yet there is not enough collaboration or cross-talk between two- and four-year colleges, particularly their writing centers. At the same time,
there exists within two-year college writing centers—and perhaps in writing centers more generally—a uniqueness factor that is often overlooked in scholarship. The authors in this issue identify varying factors, institutionally and demographically speaking, that uniquely affect TYC writing centers. Taken as a corpus, these articles are a response to the dire need for TYC writing center administrators to conduct local and replicable programmatic assessment.

Uniqueness, however, is not synonymous with inapplicability to other settings. Researchers can design studies that take into account local culture and demographics while developing methods that can be replicated in other institutional contexts. Applicability hinges on replicability, as two articles (Giaimo; Griffiths et al.) in this issue demonstrate. Replicable methods are critical to successful writing center assessment. Yet, as a field, we rely heavily on research and praxis that comes out of four-year writing centers and that is only intermittently replicable. Additionally, four-year writing centers often have vastly different staffing models, training models, tutoring models, funding models, and so on. While more cross-talk between institutional types ought to occur, there also needs to be institutionally specific, replicable research and assessment coming out of two-year college contexts.

TYC writing centers provide community where there is little and research where there might once have been only past practice. The dynamic between tutor and client might be a far cry from the camaraderie shared by Harmon’s cast of lovable yet functional misfits, but writing centers are effective in fostering a community of support, which contributes to positive student learning outcomes, as all of the articles in this issue note.
TYC writing centers and their research can be pivotal in effecting positive change in the institution and in learning outcomes; however, “home grown” assessment is critical in determining what effect, if any, our centers are having on writers and the broader institution.

To conclude, this issue could not have gone forward without the tireless efforts of the editorial team, Jamie, Alejandro, and Sarah, as well as the team of reviewers for the submissions. Thank you to all who have worked to bring this issue to publication; it was truly a team effort.
Notes

1. See Moltz for further details.

Works Cited

Community. Writ. Dan Harmon, Chris McKenna, and Ryan Ridley. Dir. Anthony and Joe Russo. NBC, 2009. DVD.

Moltz, David. “Colleges Review 'Community'.” Inside Higher Ed, Inside Higher Ed Inc., 24 Aug. 2009, www.insidehighered.com/news/2009/08/24/community.

University of Texas at Austin. Department of Educational Administration. “Institutional Report.” Community College Survey of Student Engagement (CCSSE). Austin, TX. 2011.

University of Texas at Austin. Department of Educational Administration. “Making Ends Meet.” Community College Survey of Student Engagement (CCSSE). Austin, TX. 2017, http://www.ccsse.org/docs/Making_Ends_Meet.pdf.
DRAFTING A WRITING CENTER: DEFINING OURSELVES THROUGH OUTCOMES AND ASSESSMENT

Joshua Geist
College of the Sequoias
joshuag@cos.edu

Megan Baptista Geist
College of the Sequoias
megang@cos.edu

Writing centers—and perhaps especially writing centers at two-year colleges—are not interchangeable. While we, as writing center professionals, may share philosophies and goals, our writing centers often differ wildly in structure, practice, available services, institutional relationships, and other factors. Those of us organizing these centers have different titles, qualifications, positions, and levels of security. What we have in common—what we have always had in common—is a commitment to supporting student writers, and the perpetual labor of establishing and articulating our space within our various institutions. As Outcomes Assessment has become increasingly important to accrediting bodies, and thus to institutions and administrations, the assessment process has become part of the complex negotiations of position for many writing centers. At College of the Sequoias, we felt invested in assessment as a valuable tool. But when, in 2016, we were designated as a separate unit in our Institutional Program Review process, we discovered that reexamining our assessment practices offered an unexpected opportunity to more clearly define our writing center and declare its purpose—not only to the institution, but to ourselves.
The Problem of Hyperfocused Assessment

Many of us long-time writing center true believers have grown up on Stephen North’s old axiom: “Our job is to produce better writers, not better writing” (“Idea” 438). While North has gone on to reconsider and qualify some of what he wrote in “The Idea of a Writing Center,” we can set aside the particulars and see ourselves in his purpose: declaring, unequivocally, the process-oriented essence of writing center work (“Revisiting”). Whatever the differences in our institutional contexts, our practices, or even our pedagogies, we are united in our fundamental goal: helping student writers improve their skills. Because that goal is fundamental, it often comes to define the identity of a writing center. Our own website defines the CoS Writing Center as “a place for writers of all levels . . . to talk to experienced readers about what’s happening in their writing, and to learn new strategies and techniques to help them overcome
the obstacles that lie in their path” (“Our Philosophy”). Jackie Grutsch McKinney refers to this as the “writing center grand narrative,” and observes that while there is truth in that narrative, it is also restrictive (“Introduction”). Rather than “one of many possible representations” of the broad scope of writing center work, that familiar narrative becomes, in our minds, “just what we do” (McKinney, “Introduction”). A focus on tutoring is good, as tutoring is understandably central to the image and self-image of the writing center. But what McKinney describes—and what we at the College of the Sequoias Writing Center found ourselves guilty of—is hyperfocus: a focus on tutoring so intense that it obscures our sight of the wide range of other work that we do.

Moreover, McKinney argues, part of the power of that narrative is that tutoring is an eminently quantifiable practice. “It is easy,” she notes, “if we are counting tutoring sessions, to compare one year to the next and to have quantitative data that speaks to our efficiencies and progress” (McKinney, “Introduction”). Because tutoring is, in many ways, easy to quantify—numbers of sessions, student contact hours, frequency of issues, and the like—it can easily take pride of place in our assessment work. McKinney notes that “[a] focus on tutoring allows for the development of student learning outcomes that feed into assessment protocols.” In that observation, McKinney’s summary reveals a fundamental contradiction: what we do in writing centers extends far beyond tutoring, but what we assess is often limited to the tutoring table.

At College of the Sequoias, our hyperfocus on tutoring manifested in a long-standing practice of regular analysis of our clients’ success. Working with our college’s Office of Research and Planning, we analyzed data about their classroom performance (Do they stay in the courses they’re taking?
Do they succeed in those courses? Do they earn good grades overall?) in our annual reviews of our work. These reviews have always shown a strongly positive correlation between using the writing center and desired outcomes; our student users tend to complete, succeed in, and earn higher grades in their classes than students who don’t use our services. We held up that research as evidence that what we offer is effective and deserves continued
and expanded institutional support. Any questions about our work from the administration were answered by that data.

As Schendel and Macauley argue, however, assessment is not merely about justifying our existence to administration. Rather, “it can help us understand much more tangibly the work that we do, what works best, why it works, and how we can make that work accessible to as many as possible” (Schendel and Macauley xvii). Moreover, it is a way of communicating ourselves to stakeholders beyond our walls; through assessment, “we articulate how our work fits within the institution and within the field of writing studies—within the whole idea of empowering people through writing and education” (Schendel and Macauley xviii). At the heart of assessment is a simple but vast question: Are we doing the work that we set out to do? To answer that question, we must ask it not only of what happens at the tutoring table, but of the full wild, wiggly range of inarticulable labor that occupies the periphery of the writing center.

This question was put to us in earnest when, in 2016, the College of the Sequoias Institutional Program Review Committee designated our writing center as one of the first “hybrid” units at CoS. In so doing, the college acknowledged that we are part of two different campus areas: academics and student support. We are thus obliged to assess not only what our students are learning (in the form of Student Learning Outcomes at various levels), but also the ways in which we serve the college (in the form of Service Area Outcomes). To begin our first Program Review, we needed to design Outcomes that represented the full range of our work and construct plans for assessing those Outcomes. This bureaucratic necessity became, for us, transformative, as we realized that this new assessment task allowed us to articulate our work more broadly.
As we worked toward drafting our Outcomes, we were in fact drafting a new, more complete identity for our writing center.
Outward-Facing Outcomes

At its core, assessment is about relationships and responsibilities. The first step in our overhaul was asking ourselves two questions. First, what were our responsibilities? And second, to whom were we responsible? Our focus had long been on our relationship with the students who use our writing center; thus, we had looked primarily and extensively at their success as the core of our assessment. However, the relationship we have with students and the effect we may have on their coursework is only one of our relationships within the college, and only
one facet of our complex structure. In fixating on that relationship, we had long neglected to engage with questions about our other constituencies, including our administration and staff, our faculty, and our student tutors. What were our obligations to those constituencies? How could we meaningfully measure our success in meeting those obligations? When we began the process of drafting our Outcomes for Program Review, we didn’t know.

One of the foundational tenets of assessment pedagogy is that meaningful assessment is recursive in nature, and so it bears pointing out here that we’re thinking of all of our work—our Outcomes, our Assessment Plans, even our Purpose Statement—as perpetually subject to revision. As is the case for so many writers, the hardest part of this first draft was pulling together all of the pieces in the first place. When this project began, we had one hastily-written Outcome for our student users, three slightly less hastily-written Outcomes for each of our four student tutor training courses, and three cobbled-together-by-committee Program Outcomes for our Writing Consultancy Certificate. For the purposes of our Program Review process, we needed to have a Purpose Statement, Service Area Outcomes, and coherent Program and Course Outcomes with preliminary assessment plans that would encourage us to engage in ongoing, robust self-assessment. While the details of these requirements may be specific to College of the Sequoias, they revealed five levels of inquiry that any writing center could—and perhaps should—use to define itself:

• What is our central purpose?
• What are our responsibilities to the college at large?
• What are our responsibilities to our broader academic community?
• What kind of training are we offering to our tutors?
• What kind of service and support are we offering to student writers?

Our specific answers to each of those questions are, we think, less generally useful than the questions themselves.
In our discussion below, we will focus on the steps we took in exploring those questions. The appendix to this article, however, includes the artifacts and Outcomes we generated at each level, as well as our plans for assessing them.
Rewriting the Writing Center

Defining Ourselves: Purpose Statement

At the outset of this project, we had a number of disparate paragraphs that purported to describe our
function, housed in various campus receptacles and on sundry hard drives. Many of the writers of those descriptions had long since moved on to other campus projects. Some paragraphs were too specific, some were too vague, and others were simply clunky. We wanted our purpose statement to be readable, accurate, and reflective of the fact that, at our core, we exist to provide peer support for students and student writing. Thus, we focused our purpose statement on this relationship. Whatever else we are, whomever else we serve, we always come back to our mission to provide high-quality support for student writing projects.

Supporting Our Institution: Service Area Outcomes

As a newly-minted hybrid unit, the CoS Writing Center needed to create Service Area Outcomes, which CoS Student Support Service Areas use to define their commitments to their constituents. But one of our first difficulties came from trying to identify what, exactly, a Service Area Outcome was. We scoured campus documents, asked colleagues, and poked around other institutions for guidance; at every turn, we felt we had less certainty, not more. Finally, the Dean of Student Services told us something that was simultaneously frustrating and enlightening: Service Area Outcomes are whatever you need them to be.

For our writing center, then, Service Area Outcomes are a way to define the commitments we have that are not academic in nature. Student awareness of our services, for example, is not something we can measure as part of a class or program. It isn’t contained to a classroom, or even to a single department. However, it has long been a question in our minds (and at our staff meetings): Do students know we exist? How do they find out about the services we provide?
We have partnered with our college’s Office of Research and Planning to add questions about writing center awareness to a large biennial survey of our student population; we will use students’ responses to that survey to shape our future student outreach efforts. We have long known that the bulk of our student users come to the writing center to work on writing assignments for English courses, primarily those in our composition sequence. We have long identified the need for better relationships with potential referring bodies—the History department or the Student Success Program, for example—but our previous assessment structures did not have a space for us to articulate goals not directly connected to student learning. Thus, our outcome regarding Faculty & Staff Relations was born. Numbering this among our outcomes keeps us mindful of and accountable to it as
a goal. It is our hope that we will see an increased diversity of referrals.

Our Tutor Professionalization Outcome is, on one level, part of a larger institutional effort to increase the number of certificates and degrees awarded. But on a more immediate level, it is also an effort to improve the quality of our tutoring and to increase professional opportunities for our tutors. We hope that our commitment to tutor professionalization will help us argue for increased tutor compensation in the future.

Finally, we wanted to institutionalize the idea that our services would be offered in ways that maximize efficiency and economy. Like all writing centers, we are limited by our budget and the size of our staff. Perhaps this seems obvious from the outside; after all, isn’t budget efficiency a concern at every institution of higher learning? The reality, though, is that we are often asked to offer tutoring in places and at times that our own research indicates are not an effective use of our funds. Our Student Demand Outcome provides us with a formal, institutionally supported reason to base staffing decisions on data, even when facing pressure from individual administrators with their own needs and agendas.

Supporting Our Community: Program Level Outcomes

Our Certificate in Writing Consultancy has been among our college’s program offerings for the last five years. Here, our work brushes against other, larger bodies: our Certificate includes courses from a variety of other campus departments; bears the approval of our campus curriculum committee, our academic senate, and our board of trustees; and is registered with the California Community College Chancellor’s Office. Therefore, our revision process had to consider the existing relationships, responsibilities, and practices of outside disciplines and governing bodies. In revising the Program Outcomes, we asked ourselves: What do we want students who complete this certificate to have learned?
And do the courses required for certification give them the opportunity to learn those things? We found that many of the courses included in the certificate encourage students to experience composition in diverse ways, including introductory creative writing, journalism, linguistics, and literature classes. This broad experience will prepare our tutors to support writers working on an equally broad range of writing tasks. Our first Program Outcome describes this experience. The remaining Outcomes are more focused on what students learn in up to 8 units of writing center training courses. We recognize that many of our tutors are aspiring educators, and we hope to number them among our colleagues in the future. In that spirit, it’s in
Praxis: A Writing Center Journal • Vol 15, No 1 (2017) www.praxisuwc.com
our best interest to give them fertile soil in which to grow their own pedagogies. We have therefore focused on two discrete questions: Can tutors work with students on higher- and lower-order concerns in their writing? And can tutors express how pedagogy undergirds their practices in the writing center? These questions speak to students’ learning across the full range of their experience at College of the Sequoias. In order to assess tutors at this level, we plan to collect culminating portfolios from all students earning the Certificate in Writing Consultancy. These portfolios will allow students to choose the work that best represents their experiences in the Writing Consultancy program.

Supporting Our Tutors: Course Outcomes for Tutor Training
While our Certificate Program is, of course, concerned with what our tutors are learning, its focus is on the skills our tutors carry with them into the community after they graduate. But our tutors are not simply products to be shipped off to other roles and institutions. So, in addition to using tutor learning as a way of looking at our obligation to our community, we should also look at our obligation to those tutors. How are we training, teaching, and supporting our tutors while they are with us? Our Course Outcomes for our four tutor training classes offered us a way to answer this question. Those four classes are intended to help students build a library of tutoring techniques, to prepare them for the varied challenges of writing center tutoring, and to lead them through a sequential process of deeper study and inquiry into writing center—and ultimately composition studies—theory and practice. As we revised these Outcomes, we found an opportunity to express precisely what we want our tutors to be learning. As we rethought our approach to assessing those Outcomes, we found an opportunity to ensure that our tutors were, in fact, learning those things.
Tutor training takes many different shapes at different institutions, with varying degrees of formality. But we should hold ourselves accountable to articulate our expectations for what tutors should learn, and to evaluate how well we are teaching and supporting them, especially in those cases where tutor training is not formally linked to a course assessment process.
Supporting Our Students: Course Outcomes for Student Writers
Somewhat fittingly, we finished our drafting work almost exactly where we began: discussing what happens between student writers and their tutors inside the writing center. Our first pass at this was almost crass in its simplicity: “Students will improve their writing.” However, we decided to flesh out what this would look like and how those improvements should come about. After all, the quickest way to improve student writing would be, in many cases, to write it for them; this, however, does not serve our purpose, nor does it align with our college’s code of student conduct. After discussion, we landed on the following: “Students will improve their writing ability by drafting, revising, and polishing with a tutor.” Isn’t this what we do, week in and week out? Here we returned to our long-standing, foundational question of assessment: are our clients succeeding? This question—the object of our hyperfocus—is and always ought to be central to our work in the writing center, and we should never let it go.

Working Forward
As Outcomes Assessment becomes an increasingly pressing reality, and as it is deeply ingrained into institutional practices of resource allocation, it is easy to look at assessment as a bargaining tool. It is easy to let assessment create a sense of division between the writing center and the institution. We feel we must offer up our assessment results to the institution, or use them to defend ourselves against our institution, or wheedle resources from our institution. But in truth, as we work to articulate ourselves across the many levels of our work, we make ourselves more deeply and meaningfully a part of our institutions. As Schendel and Macauley say, “it is through this reciprocal relationship of articulating our values in a way others value, and reshaping our values to reflect others’ values, that we can change the assessment dynamic from ‘us’ versus ‘them’ or ‘us’ and ‘them’ to a coherent, collegial, inclusive ‘us’” (85). Rewriting ourselves has been an arduous process, but as in so many cases, its ending is, in truth, a beginning. What we do from day to day has changed very little, but the way we look at that work has expanded vastly. We continue to engage in that wild, wiggly range of inarticulable labor that is the work of a writing center—only now, we have articulated it. Having mapped out our commitments, we can see them clearly and begin the work of exploring how and how well we meet those commitments.
Works Cited
McKinney, Jackie Grutsch. Peripheral Visions for Writing Centers. Kindle ed., Utah State University Press, 2013.
North, Stephen M. “The Idea of a Writing Center.” College English, vol. 46, no. 5, 1984, pp. 433-446.
---. “Revisiting ‘The Idea of a Writing Center.’” The Writing Center Journal, vol. 15, no. 1, 1994, pp. 7-19.
“Our Philosophy: What We Stand For.” College of the Sequoias Writing Center, 12 Feb. 2015, www.cos.edu/Library/WritingCenter/Pages/OurPhilosophy.aspx. Accessed 6 Jan. 2017.
Schendel, Ellen, and William J. Macauley, Jr. Building Writing Center Assessments that Matter. Utah State University Press, 2012.
Appendix
Assessment Overview – College of the Sequoias Writing Center
CULTIVATING PROFESSIONAL WRITING TUTOR IDENTITIES AT A TWO-YEAR COLLEGE
Alison Bright
University of California, Davis
asbright@ucdavis.edu
Professional Development of Professional Tutors at Two-Year Colleges
Since their inception, writing centers at two-year colleges have had to be creative in their methods of maintaining a staff of tutors who can meet the writing support needs of their student writers. As early as 1981, Gary Olson noted, “[s]taffing the center is perhaps the most difficult problem two-year colleges encounter” (21). In contrast to writing centers at four-year institutions, which traditionally rely on peer tutors, the trend at two-year colleges, as noted by Leslie Roberts in 2008, has been to employ a wide variety of tutors, such as English instructors, professional tutors, and volunteers, in addition to peer tutors. Because writing tutors at two-year colleges come from a wide variety of backgrounds, they similarly bring a wide variety of experiences to their work. And while these varied backgrounds and experiences can potentially enrich the tutorials of student writers at two-year colleges, they can also potentially result in a disconnect between the writing tutors’ expectations for the tutorial and the best practices in the field. For example, English instructors employed as tutors by two-year colleges may over-rely on their instructor behaviors during a tutorial, which are not always conducive to developing the non-evaluative environment that is accepted as a best practice in writing center theory.1 Thus, a tutorial could end up looking and sounding a lot like a visit to office hours, rather than a writing consultation. Similarly, institutions that employ peers may struggle to maintain a strong pool of tutors, since tutors may privilege their responsibilities as students and transfer in two years (an occurrence noted as early as 1983 by Thomas Franke).
Other two-year colleges employ professional tutors, and while these tutors have long played an integral role in writing centers, little research has examined the unique professional development needs of professional tutors in two-year college writing centers. As part of a research project studying whether writing center tutor training programs could foster a “tutor identity” in their participants,2 akin to what K-12 teacher educators have found in their research into
“teacher identity,” I observed the training program for professional tutors at a large two-year college in Southern California. Teacher education researchers Janet Alsup and Deborah Britzman determined that the inclusion of a “teacher identity” focus in teacher preparation programs produced pre-service teachers with a strong fit and understanding of the profession. In my research, I similarly found that the observed writing tutor training program prepared participants with a strong understanding of what it means to be a professional writing tutor at this two-year college. The program gave new tutors the opportunity to create a tutor identity that was distinct from their other professional or academic identities. The program’s focus on developing a tutor identity prevented the new professional tutors from depending upon past models from other professional experiences that would not have been appropriate in writing tutorials. The observed tutorials of these tutors displayed evidence that the professional tutors similarly had a strong fit and understanding of the professional tutor identity as constructed by the local writing center. That is, the tutoring behaviors in the tutors’ observed tutorials displayed evidence of the student-focused writing center philosophy promoted in the training program.
A Preparation Program Focused on Identity
As a former writing center tutor, consultant, and director, I’ve noticed how important it is for preparation programs to include explicit content about professional identity development, so that tutors do not rely on acting like tutors in their tutorials, but instead become tutors.3 But, as anyone who has developed a tutor preparation program can tell you, it can be difficult to address both your center’s theoretical approach to writing tutorials and pertinent practical concerns in the curriculum of your preparation program. This is especially true if your program is condensed to a one- or two-day workshop, instead of a preparation course. So, how can writing center directors at two-year colleges quickly facilitate the development of their professional tutors’ “tutor identities” in a way that distinguishes them from other professional identities (particularly from that of teacher
identities)? Based on my observations of the two-year college writing center examined in my study and a review of relevant literature on identity development, I suggest that writing center directors at two-year colleges consider developing training programs that include a significant exposure to models of professional tutors4 and a strong inclusion of the discourse of the writing center discourse community.5
Key Elements of Observed Preparation Program
The training program for the professional tutors that I observed at this two-year college began with the hiring process, where prospective tutors take a half-hour “skills assessment test.” They are then debriefed about the test in a later interview between the tutor candidates and one of the two writing center directors. The test examines the applicants’ abilities to isolate and explain grammatical errors, to respond to a section of student writing, to reflect on the difference between teaching and tutoring, and to complete a brief writing sample. After the tutors are officially hired, they are required to attend a campus-wide, three-hour workshop for tutors in all disciplines. All of the writing tutors are also required to attend a one-hour “welcome back” workshop in the writing center held in the first few weeks of the fall semester; this is typically where new and returning tutors meet for the first time. At this meeting, new writing tutors are given a locally produced tutor handbook, which includes both theoretical approaches to tutoring and practical concerns related to working in the local center; newly hired prospective tutors are asked to read and annotate the handbook before their first shifts in the writing center. For their first few shifts, new writing tutors do not actually tutor student writers. Instead, they are introduced to the staff and the space of the writing center, and debriefed on their responses to the handbook by an available writing center director. They also observe tutorials performed by experienced tutors. After they conduct one or two tutorials independently, the new tutors are observed tutoring by one of the writing center directors, who later debriefs them on their performance.
The writing center directors also hold a two-hour mid-semester workshop and potluck, where tutors check in with one another and discuss relevant issues they have noted through their observations or their self-reflections, which are completed after every session. At the end of the semester, each tutor completes an additional, longer self-reflection, and is again observed by a writing center director. The
professional tutors are also presented with additional, optional modes of preparation: a one-unit course taught by either of the two writing center directors and monthly “brown bags,” or informal colloquia focused on issues in writing studies, which are open to the entire campus community.
Observed Outcomes of Preparation Program
Due in large part to the significant exposure to models of professional tutors and the required inclusion of writing center discourse (in the form of the tutor handbook), the four professional tutors I observed in this research project developed professional tutor identities that were strongly aligned with the desired outcomes of the preparation program (as outlined by the two writing center directors). And because the professional tutors observed such a wide variety of tutor models, they understood that the preparation program did not promote a singular or static notion of tutor identity. Instead, the program encourages the new professional tutors to consider who they are as tutors, in addition to who they are as teachers, teaching assistants, writers, etc. This was due in part to the fact that, like many other professional tutors, the four professional tutors profiled in my research held at least one other professional position during the time data were collected. Perhaps, because the tutors already possessed fully developed professional identities, they were able to navigate the successful development of tutor identities using pre-established characteristics of their other identities. Even though all four of the professional tutors in this case had exposure to potentially conflicting identity models, such as that of teacher or editor, they were able to resist employing these other professional identities, which would not have been appropriate for the writing center’s context. The age and professional experience of these tutors indicated that they had been exposed to multiple identity models and had more experience constructing institutional identities.
This suggests that writing center directors at two-year colleges may benefit from encouraging participants to consider the identity characteristics of past professional identities and how they align or conflict with the desired identity characteristics of the writing tutors at their own centers.
Notes
1. See Murphy and Hobson for descriptions of these practices.
2. The focus on “tutor identity” in this project was based on the theoretical frame that K-12 teacher educators have found in their research into “teacher identity.” For examples, see Alsup, Danielewicz, and McKinney et al. for more information about teacher identity.
3. See Bright for a discussion of this behavior in undergraduate writing tutors.
4. See Wortham for a discussion of the use of identity models in identity construction.
5. See Benwell and Stokoe for an analysis of the use of discourse in identity construction.
Works Cited
Alsup, Janet. Teacher Identity Discourses: Negotiating Personal and Professional Spaces. Lawrence Erlbaum Associates, 2006.
Benwell, Bethan, and Elizabeth Stokoe. Discourse and Identity. Edinburgh University Press, 2006.
Bright, Alison. “Becoming Peer Tutors of Writing: Identity Development as a Mode of Preparation.” Teaching/Writing: The Journal of Writing Teacher Education, vol. 2, no. 1, 2013, pp. 22-28.
Britzman, Deborah P. Practice Makes Practice: A Critical Study of Learning to Teach. State University of New York Press, 1991.
Danielewicz, Jane. Teaching Selves: Identity, Pedagogy, and Teacher Education. State University of New York Press, 2001.
Franke, Thomas L. “A Case for Professional Writing Tutors.” Teaching English in the Two-Year College, vol. 9, no. 2, 1983, pp. 149-50.
Hobson, Eric. “Maintaining Our Balance: Walking the Tightrope of Competing Epistemologies.” The Writing Center Journal, vol. 13, no. 1, 1992, pp. 65-75.
McKinney, Marilyn, Saralyn Lasely, and Rosemary Holmes-Gull. “The Family Writing Project: Creating Space for Sustaining Teacher Identity.” English Journal, vol. 97, no. 5, 2008, pp. 52-57.
Murphy, Christina. “The Writing Center and Social Constructionist Theory.” Intersections: Theory-Practice in the Writing Center, edited by Joan A. Mullin and Ray Wallace, National Council of Teachers of English, 1994, pp. 25-38.
Olson, Gary A. “Establishing a Writing Center in the Junior or Community College.” Community College Review, vol. 9, no. 2, 1981, pp. 19-26.
Roberts, Leslie. “An Analysis of the National TYCA Research Initiative Survey Section IV: Writing Across the Curriculum and Writing Centers in Two-Year College English Programs.” Teaching English in the Two-Year College, vol. 36, no. 2, 2008, pp. 138-52.
Wortham, Stanton. Learning Identity. Cambridge University Press, 2006.
CREATIVE STAFFING FOR THE COMMUNITY COLLEGE WRITING CENTER IN AN ERA OF OUTSOURCED EDUCATION
Jill Reglin
Lansing Community College
penninj@lcc.edu
Ask anyone who works in a community college writing center to list the challenges they face, and staffing will undoubtedly be among their top concerns. Developing a successful community college writing center means first asking the question, “What kind of assistance do students need?” The question that follows is almost always, “Who should provide this assistance?” Staffing a community college writing center poses a set of problems unique to the two-year higher education environment. The recent popularity of contracted, third-party, for-profit tutoring services contributes significantly to these complexities. Third-party tutoring service firms often include a writing lab as part of a package deal with services for other, high-demand support, typically in STEM fields. Attempting to fill gaps in those high-demand areas, college administrators see writing tutoring as an included bonus in the package; but, outsourced tutoring comes with a hefty price tag for the subscription, as well as administrative costs associated with implementation, advertising the service to students, and teaching both faculty and students how to access and use it. More significantly, the practices these companies employ are often at odds with the pedagogical standards embraced by the writing center professional community. I have seen enough tutoring demonstrations, chat transcripts, marked-up papers, and emailed feedback provided to students by for-profit education companies to say that the type of feedback they offer appears to run counter to what we teach our writing center staff members to provide. The tone of the feedback is often that which would typically come from an instructor, rather than a writing assistant.
The suggestions for revision are often generic and/or advise students to follow “rules” for writing that not all writing teachers would embrace. The comments provided often go overboard in terms of wordsmithing—if not outright editing—so much so that a colleague in my own institution suggested we would need to revise our plagiarism policy to accommodate the “feedback” students might receive from such services. Perhaps most concerning, and most antithetical to the work of a writing center, is the absence of dialogue about both writing process and product. It appears that several of these services allow
students to simply submit a paper without a written assignment prompt and without identifying specific topics on which they would like to receive feedback. This is akin to dropping off a paper at the front counter in a writing center and picking it up later. As writing center professionals, we would be naïve if we did not take these companies seriously. Their flashy sales pitches, promises of 24/7 on-demand tutoring, and growing popularity with college administrators make them our real competitors. We must position ourselves to offer thoughtful, professional input on the quality of these services and explain clearly the pedagogical nature of our concerns. The asynchronous, mostly “canned” and prescriptive feedback, and the absence of conversation about both the writing and the writer effectively strip away the pedagogical function of a writing center. The work of educating the campus community about writing center pedagogy can be tiresome, but it never ends. Neither does the work of designing a cost-effective yet flexible and high-impact staffing model based on best practices. If the answer to “Who should assist our students with their writing?” is not a third-party, contracted tutoring service, then what is it? As a writing center administrator with twenty years of experience in the two-year college setting, I have heard a long list of responses to this question:
1. Peer tutors, and nobody else! (This was my own response not too long ago.)
2. Certainly not peer tutors. They are only a year or two into their college studies and therefore not qualified. We don’t even have English majors here.
3. Peer tutors; they’re inexpensive to employ, so it isn’t as wasteful if we pay them to do homework while waiting for students to visit.
4. Professional staff. Tutors need to have a degree, credential, or certification of some sort.
5. Professional staff. Someone with a bachelor’s degree can tutor in most any subject area.
6. Faculty! They should be afforded opportunities to work with students in meaningful ways outside of a traditional classroom setting.
7. Faculty are the most qualified to do this work.
8. Faculty should do this work as part of their workload. (Translation: We don’t really need a budget for a writing center staff.)
9. And finally: Students have access to faculty during office hours. This is all the help they need.
Though some of these responses are more problematic than others, all are fraught with at least some misconceptions about who is qualified to support student writers. But why should staffing the community college writing center force us to choose student peer assistants or professional staff or faculty? Why can’t we include them all? The short answer is: We can. The longer answer is: We should strongly consider a multi-tiered, blended staffing model. Peer tutoring is a well-accepted practice with a long history of success. Well-educated peer tutors are qualified and capable of assisting their peers with both writing process and product. They are qualified in ways others are not because they share a social status or space with other students—regardless of level of study or age difference. The power of equal footing cannot be mimicked or taken for granted. This point is well documented by Ken Bruffee, Andrea Lunsford, Muriel Harris, Peter Carino, Stephen North, John Trimbur, and Brian Fallon. It also appears in recent textbooks like the Bedford Guide for Writing Tutors (Ryan and Zimmerelli) and the Oxford Guide for Writing Tutors: Practice and Research (Ianetta and Fitzgerald). Peer tutoring is also strongly supported in the “IWCA Position Statement on Two-Year College Writing Centers.” Employing students as peer tutors affords them an incredible work experience and leadership opportunity that looks impressive on a resume and forms a crucial part of their education.
Findings from The Peer Writing Tutor Alumni Research Project indicate that skills learned during writing center work are broadly applicable across a wide variety of fields and occupations (Kail, Gillespie, and Hughes). Many of the peer writing assistants we hire have never held a job in a professional workplace before, much less interviewed with a panel of professionals. Those who supervise student employees recognize that they are students first and employees second. According to 2012 CCSSE data, 19% of full-time students work more than 30 hours per week, and 29% of full-time students care for
dependents 11 or more hours per week (“A Matter of Degrees”). A 2017 CCSSE report states, “A student who always considers him or herself a part-time student might identify as a worker who goes to school and is likely to see college as one of multiple competing demands” (“Even One Semester”). Student employment accommodates student schedules, helps promote better work-life balance, provides an enriched connection to campus life, and offers meaningful work experience within a professional but nurturing environment. Student peer tutors are more likely to represent the great spectrum of diversity seen across the student population. Our center at Lansing Community College has hired students with a wide variety of linguistic backgrounds, ethnicities, and countries of origin. Our somewhat small staff currently varies in age from sixteen to sixty. In the twenty years I have been hiring and training peer assistants, I can count exactly two who ultimately did not succeed in their jobs. The drawback of relying entirely on peer tutors is their longevity as employees, though a January 2017 WCenter discussion thread initiated by Clint Gardner indicates that turnover at four-year colleges and universities that rely on peer tutors might not be dramatically different when compared to turnover in two-year colleges. Nevertheless, we normally counted on losing 50-60% of our staff each year when we hired only student employees to provide writing assistance. Some community colleges, such as the Community College of Rhode Island and Glendale Community College, have been successful in requiring students to complete a for-credit training course prior to working in the writing center. Though prior completion of a training course results in a well-educated, qualified student staff, the trade-off is a delay in hiring if course completion is required before applying for a job in the writing center. This delay can shorten the duration of availability for employment, as well.
A student employee who completes the writing center training course in their second semester might only be available to work for one or two semesters before graduating or transferring. The alternate option of hiring peer tutors and providing paid training is perhaps more attractive; but, it can get expensive if hiring takes place every semester. I would argue, however, that peer tutors are still a bargain for the institution. Paying a peer tutor at a rate of $10 per hour for a 30-hour training program costs a whopping $300 per student. Even with indirect costs and the paid time of an administrator to offer the training, the total amount is far from staggering. Professional tutoring staff (those whose employment is not dependent on being students) add stability and
longevity to a writing center. If job descriptions are written well, at least some of these staff positions can be made available for student employees to advance into after they have completed a certain number of credits, or after they transfer to a nearby university to continue their studies. Professional tutor positions require at least some educational credentials (such as the completion of one year of full-time coursework or even an associate’s degree) and prior experience with tutoring. Professional tutors should have job descriptions that require a different level of responsibility than student peer tutors, as well. It is simply unethical—if not a violation of contract within unionized institutions—to pay people different rates for performing the same work. Professional staff should do more heavy lifting, perhaps by handling online appointments, supporting classroom work, engaging in ongoing weekly appointments with students who need more comprehensive support, leading discussions at staff meetings, and mentoring both student employees and newer professional staff. Careful student intake can help to determine what level of support students need and how this level might change as they advance in their studies. The mistake we often make with hiring professional staff is assuming that they don’t need training. Unless they have worked in the very writing center that is hiring them or provided writing assistance in another higher ed environment that subscribes to a writing center philosophy, chances are they do. A degree cannot take the place of exposure to a solid foundation in writing center theory and practice. At the very least, newly hired professional tutors should be provided a set of required readings and an opportunity to discuss them in some forum with other staff early on in their employment. A good balance of professional tutors with peer tutors is also important.
What constitutes a good balance varies greatly from one institution to another, but I would urge caution in adopting a staffing plan that is lopsided in favor of professional tutors. Though these more highly credentialed staff can meet the needs of certain populations of students, the great majority of our students’ needs can and should be met by peer tutors. Peer-to-peer learning situations have great potential to destigmatize “tutoring” and promote a non-directive pedagogy often embraced naturally by students working with other students. The decision to include faculty as tutors within a writing center is indeed a contentious one. My argument for many years in favor of a “peer tutors only” model reasoned that a writing center should offer students help that was substantially different from what they already had access to during faculty office hours.
Overcoming my own reluctance to experiment with faculty writing assistance required me to set aside a fear that the writing center would become a faculty-dominated space. I worried that the presence of faculty in the writing center might intimidate both students and staff. I was anxious that a budget-chopping administrator might decide that the writing center could be staffed entirely by faculty fulfilling non-instructional workload hours. However, if interested and well-qualified faculty are chosen with care and are open to being trained in writing center pedagogy, they can become significant assets in two-year college settings. Faculty work in the writing center can build a foundation for a powerful grassroots WAC program within the institution. Howard Tinberg’s research on collaborative reflection among community college peer tutors and faculty documents the power of bridging the unnecessary divide between staff and professors as they work to support student writing. Having faculty on hand who are content experts in fields like molecular biotechnology, religion, art, paralegal studies, fire science, and mental health nursing has broadened the base of content expertise within our writing center. Their participation as writing assistants has opened our eyes to various styles of writing and documentation. The faculty members have become writing center advocates within their own disciplines and have learned invaluable lessons of their own about teaching writing within those disciplines. A number of our faculty have come to recognize that they were unintentionally editing students’ papers. They have gained insight into how students interpret assignments, causing them to make important revisions to their own assignments. Most have gained a tremendous amount of respect for the complex work undertaken by our student employees and professional staff.
We include only six faculty writing assistants on the writing center’s staff per semester and schedule them to work with students two hours per week, thereby avoiding the instantiation of a faculty-dominant writing center. Faculty carry the title of Writing Assistant, like all other staff, and students can request to work with them by first name, as they can with any other staff member. Students generally do not know (and do not care) that they are working with a writing assistant who is a faculty member. Faculty are genuinely interested in the training they receive, and several attend our writing center staff meetings, even though this is not an expectation. A multi-level, blended staffing model might not be a good fit within all community college writing center contexts. However, creating a space on campus where student employees, staff, and faculty work alongside each other toward the common goal of assisting
Praxis: A Writing Center Journal • Vol 15, No 1 (2017) www.praxisuwc.com
students with their writing based on writing center pedagogy is undeniably powerful. Not only is this approach flexible, dynamic, high-impact, and cost-effective, but it also creates a visible and recognizable center for writing; the synergy that naturally occurs within it holds promise for changing the very culture of writing in community colleges for the better.

Works Cited
“Even One Semester: Full-Time Enrollment and Student Success.” Center for Community College Student Engagement, 2017, www.ccsse.org/docs/Even_One_Semester.pdf.
Gardner, Clint. “Re: Turnover Rates.” WCenter, 18 Jan. 2017, Texas Tech, lyris.ttu.edu/read/search/results?forum=wcenter&words=turnover&sb=1. Accessed 19 July 2017.
Ianetta, Melissa, and Lauren Fitzgerald. The Oxford Guide for Writing Tutors: Practice and Research. Oxford UP, 2016.
“IWCA Position Statement on Two-Year College Writing Centers.” International Writing Centers Association, 25 Jan. 2007, writingcenters.org/wp-content/uploads/2008/06/twoyearpositionstatement1.pdf.
Kail, Harvey, Paula Gillespie, and Bradley Hughes. The Peer Writing Tutor Alumni Research Project, 2017, writing.wisc.edu/pwtarp/.
“A Matter of Degrees: Promising Practices for Community College Student Success (A First Look).” Center for Community College Student Engagement, 2012, www.ccsse.org/docs/Matter_of_Degrees.pdf.
Ryan, Leigh, and Lisa Zimmerelli. The Bedford Guide for Writing Tutors. 6th ed., Macmillan Learning, 2016.
Tinberg, Howard B. Border Talk: Writing and Knowing in the Two-Year College. National Council of Teachers of English, 1997.
WHO WE ARE AND WHY IT MATTERS: THE UNIQUE IDENTITY OF THE BRONX COMMUNITY COLLEGE WRITING CENTER Jan Robertson Bronx Community College janet.robertson@bcc.cuny.edu [Editors’ Note: The video to which this column refers is freely available at praxisuwc.com.] The Bronx Community College Writing Center community is an eclectic mix of races, religions, cultures, and interests. Our tutors and staff are former or current BCC students. They are foreign nationals or have lived or traveled around the world; they speak languages from every corner of the earth—Spanish, French, Hindi, Urdu, Farsi, Twi, English, and Patwa. They are Christian, Jewish, Muslim, Hindu, Atheist, and Agnostic. They are equally academically diverse, with degrees ranging from bachelor’s to master’s and majors in Business, Education, Psychology, Social Work, Physics, Creative Writing, and English. These significant differences might easily divide any other community, yet the BCC Writing Center staff instead celebrates its diversity with a unique, familial, and welcoming camaraderie. The Bronx Community College Writing Center film project Who We Are explores how this is possible through the staff’s reflections on their work at the writing center, all with still another question in mind: does diversity matter? The tutors reflect on the impact of their differences on each other and their work, as well as on their perceptions of the diverse identities of our institution’s students.
Background
The site of a Revolutionary War battle, Bronx Community College sits on top of one of the highest natural elevations in the city of New York. Architect Stanford White designed the campus in 1890 to house New York University’s uptown undergraduate School of Engineering. Its elegant architecture includes the well-known Hall of Fame for Great Americans and the ornate Gould Memorial Library, which is modeled after the Pantheon in Rome and boasts an 18-karat, gold-leaf ceiling. Amidst this rich historical background, Bronx Community College today serves a population of over 11,000 students from a veritable United Nations of multicultural backgrounds, including the United States,
West Africa, India, Pakistan, Bangladesh, China, Europe, the Caribbean, and South America. Students bring their writing to our writing center from courses across the disciplines. Their writing reflects perspectives formed through the lens of their diverse worldviews and languages, intensified by their complicated lives—balancing full-time jobs, parenting, and attending school. Some express a curiosity and hunger to understand concepts in American culture or to connect newly learned ideas with their pasts. One student asked, “What does ‘student activities’ mean? What are student activities?” A West African student shared how her rural village culture’s beliefs had taught her that witchcraft had caused her twelve-year-old brother’s fatal heart attack. She wrote about her new understanding of coronary artery disease. Our students have stories to tell us through their writing. The Bronx Community College Writing Center tutors must have a passionate interest in understanding what BCC students want to write about. The tutors must have empathy and the ability to listen. They do.
Who We Are
The film interview project began with a set of twelve interview questions, generated to elicit responses from nineteen tutors and staff. Each interviewee was filmed separately and asked the same set of questions. They did not have the questions ahead of time, nor did they prepare their answers or coach each other on what to say. They were given unlimited time to speak. Of the nineteen interviewed, three were Asian (two Pakistani and one Bangladeshi), one Ghanaian, two white American-born (one of whom was raised in Zambia), one biracial Iranian-American, one biracial Indian-American, two African American, one Haitian-American, three Jamaican, one Trinidadian, and four Dominican. They were asked about their countries of origin, the languages they were raised speaking, and other countries they had visited. They were also asked to reflect on the diversity of our writing center, what they have learned from working in it, and the ways in which their cultural backgrounds have impacted their work.
They were also given the opportunity to sing, play an instrument, or recite poetry. Music is another language, another form of diversity. We filmed Daniel Tehrani playing setar and drums, Corey Spencer and Stefan Nunez reciting original spoken word pieces, and Penelope Meyers playing the flute. The editing process of the film eliminated segments that did not provide information relevant to the question being explored. Responses were grouped into the categories shown in the film, the most compelling responses surviving the cut. Out of the interviews emerges the unique identity of the BCC Writing Center—who we are. What is most remarkable is that although they were interviewed separately, many observations are almost identical. Every interviewee comments on the ability of everyone in the center to get along with each other despite their varied cultural backgrounds. They seem to feel that it is partly because of their cultural differences that they are very interested in learning about each other, and that seems to strengthen their friendships. They all say they love working in such a diverse environment and have learned so much from each other. They are impressed with each other’s customs and knowledge of other languages and say working together inspires them to learn each other’s languages. Tutor Sam Kimball notes that he always has questions, such as, “How do you say this in Farsi? Or that in Urdu or Spanish?” (Who We Are, unedited version). Not a single respondent says that diversity doesn’t matter; rather, it enhances their love of and commitment to each other and their work. “Diversity?” asks a tutor named Britney Francis. “I could tell you that for lunch time! We have curry, plantains, arroz, pizza, special desserts . . .” (Who We Are, unedited version). Yet there is more.
Besides languages, music, and traditional dishes, they each bring unique values to the cultural mix, values such as a fierce appreciation and “value for education” taught in Jamaica (Assistant Director Kenisha Thomas, Who We Are), or “extreme politeness” taught to children in Ghana (Receptionist Rejoice Nanor, Who We Are, unedited version). We are a cazuela, a cultural stew. However, as tutor Betty Doyle points out, “We are extremely interested in our differences, but we are blown away with our similarities because we are so much alike” in personalities, interests in relationship struggles, careers, hopes and dreams, but also writing, art, music, science, poetry, politics, religion, and—of course—food (Who We Are)! They all express pride and amazement at the smooth operation of our writing center. They joyfully claim, “We all just seem to fit together” (Doyle and
Thomas)! “Even though we are diverse, we all get along really well” (Nunez, Who We Are, unedited version). The writing center—a safe place for freedom of expression and exploration of unique, unusual ideas—is the great equalizer.
Why It Matters
In a survey of university writing centers in the Northeast, seven of the twelve responding colleges indicate that their campuses are 90% white and that they wish they were more diverse. Most agree that diversity does matter because it helps bring about a greater understanding of others. However, some argue that the diversity of a writing center staff shouldn’t be an issue because tutoring, by its very definition, requires connecting to others on a universal human level (NEWCA 2013 Survey). Is it important to be able to look past race, ethnicity, or other diversities? Writing centers that are not ethnically diverse undoubtedly share the same collaborative culture as the Bronx Community College Writing Center. Therefore, the question remains: does diversity matter? The BCC tutors all acknowledge the importance of collaborative culture in a writing center when they speak about the patience and non-judgmental acceptance they learned in “creating an environment where knowledge can be found,” as tutor Corey Spencer observes (Who We Are). Tutor Shazia Khan suggests, “Teaching and learning is give and take. I can teach something or I can learn something from a student” (Who We Are, unedited version). However, the BCC Writing Center tutors, without exception, all argue that diversity helps to better serve the students by creating a safe environment, “an environment of acceptance,” as Doyle notes about age diversity and the needs of older students (Who We Are). Tutor Noman Jalal observes that his background helped him to explain things to a Bangladeshi student in ways that another tutor, not from his culture, could not (Who We Are). Diversity helps create a feeling of comfort for students of varying backgrounds who come to the writing center. Tutor Aisha Sidibe reflects, “Students come to the writing center feeling very afraid, and it helps to see someone from the same background as you” (Who We Are).
Similar conclusions could certainly be drawn by anyone in any writing center anywhere. Even for writing centers that are less ethnically and racially diverse, there are so many other kinds of diversity to consider—age, sexual orientation, and socioeconomic status, to name a few. The interviews with the Bronx Community College tutors and staff reveal their deep understanding and
love of the diverse environment in which they work, the haven it provides, and the population who depend upon its services for support with their writing. Jalal muses, “Wherever I will go, I will have this experience of the writing center in my mind because it has changed my life” (Who We Are). The picture of a true learning community emerges in the film’s depiction of the ongoing give and take of teaching and learning. We have included the film as part of our introduction to the writing center for scheduled tours of classes. The students and faculty see the tutors’ commitment, caring, and humility. They have commented that it makes them feel welcome and that some of their misgivings have fallen away. Therefore, in creating communities of acceptance in our writing centers, is it looking past differences or embracing them that makes for excellent tutoring? The story of migration from one geographical location to another is part of the tapestry of human history. Our planet is increasingly globally accessible. There is an abundance of opportunity and a dire, growing need to learn from and about each other. As tutor Miguel Gil says, “In sharing our diversity, we acquire greater diversity by sharing who we are” (Who We Are). Still, writing centers everywhere do share the commonality of a collaborative, accepting, and inclusive culture; and indeed, we must see not only the differences but also the universal humanity of all who enter our centers. If the tutors are learning that, then even those who are working in less diverse environments will be better prepared for the day when they find themselves in the global and ethnic mix all nations are becoming—the day they realize, as the BCC tutors have learned, that “in relating with others, in relating with the world, we find out who we are” (Gil, Who We Are). Without a doubt, that day will come.

Works Cited
Robertson, Janet. “NEWCA 2013 Film Survey.” University of New Hampshire, April 2013. Survey.
Who We Are. By Jan Robertson and the Bronx Community College Writing Center tutors, produced by Jan Robertson, videography by Richard Martinez, Bronx Community College Media Technology Department, 2013. Film.
“AT FIRST IT WAS ANNOYING”: RESULTS FROM REQUIRING WRITERS IN DEVELOPMENTAL COURSES TO VISIT THE WRITING CENTER Wendy Pfrenger Kent State University wpfrenge@kent.edu
Rachael N. Blasiman Kent State University rvolokho@kent.edu
James Winter Kent State University jwinter2@kent.edu
Abstract
From fall 2013 through spring 2016, 1,301 students were enrolled in composition courses on our regional campus, with 349 of these enrolled in developmental courses. Our writing center serves approximately 14% of the campus population every year, a number we have seen increase since two professors in 2013-2014 began requiring students in their developmental courses to attend a minimum number of writing sessions each semester. The D-F-withdrawal rates for developmental writing courses on our campus have averaged 32.7% over the past six semesters, an improvement over previous years. Analysis of data from a study of student outcomes during this period demonstrates that requiring frequent visits to the writing center in early semesters results in a statistically significant, positive relationship with increased passing rates and voluntary usage of the writing center.
Our writing center is located within a multi-subject learning center on a regional campus of an open-enrollment, land-grant public institution. The campus mission is to bring higher education to an age-diverse, rural student population in an Appalachian corner of Ohio, yet the fact that 81% of our students require remediation of some kind as entering freshmen suggests the challenges and contradictions inherent in such a mission. Despite the favorite adage of writing centers, “better writers, not better writing” (North 438), our writing center staff are acutely aware of the high stakes for writers coming through our doors. Their need to succeed is driven not only by a desire for higher learning, but also motivated by immediate economic necessity. We hope that we are helping . . . but until now we have not attempted to examine quantitatively whether what we observe anecdotally is, in fact, suggested by the data. In this article, we will describe a quantitative analysis of the progress of students enrolled in developmental writing courses at our university, comparing the results of those required to attend writing center sessions to those who were not required to attend the writing center. In particular, we are interested not only in student perception of the impact of writing center sessions, as previous studies of required visits have documented (Bishop; Clark; Gordon) but in examining the progress of students as they become better academic writers. In a meta-analysis of writing center research, Casey Jones described the multidimensional challenges of assessing writing center efficacy. Jones observes that
“the most intractable problem involved in trying to evaluate the relative effectiveness of writing centers lies in the personal, subjective nature of the phenomenon under scrutiny and the problems that this raises in terms of evaluation criteria,” adding that measuring “gains” in writing ability, in addition to requiring subjective judgment, also becomes problematic in light of the writing center’s pedagogical emphasis on the writer rather than the writing product (8). Along with struggling with what, exactly, to assess (writer or product), earlier attempts at quantitative assessment of efficacy have also struggled with issues of sample size and the controlling of such variables within the inherently subjective context of writing instruction, tutoring, and evaluation. Such studies have frequently resulted in conclusions that writing center interventions show no statistically significant impact on global improvement in writing (Sadlon; Sutton and Arnold), although in at least one study (Sadlon) students demonstrated improved ability in the area of higher-order concerns (ideas and organization), and in others students demonstrated improvement in mechanical skills (grammar, punctuation) and/or test-taking ability (David and Bubolz; Naugie; Wills). The use of grades as metrics, given this context, may be problematic for obvious reasons. On our campus, professors participate every semester in a portfolio review process for all students enrolled in the first of our developmental writing sections. In this way, we ensure that, for the most part, our department offers a degree of consistency in rubrics and standards between course sections. Completing a course successfully with one professor means that a student is very likely prepared for the standards of the next course, regardless of who teaches it.
However, the problems described above with regard to measuring “improvement” rather than performative competence seemed to us a few steps removed from the central question we are asking: are the writers we work with better able to achieve their goals in the university as a result of visiting the writing center? Given this context, focusing on the writer rather than the writing product seemed to us an approach more likely to yield insight. This focus required examining evidence of the writer’s adaptation to the university, including successful
completion of required writing courses (not simply grades), shifts in the writer’s attitude and approach to academic writing, and sustained pursuit of progress as a writer through repeated visits to the writing center. As we are interested not only in the institution’s goals of educating students but also in the students’ goals of making progress toward degree attainment, we have focused on evidence of progress within the university in addition to shifts in attitude and behavior in a large sample of students, using university data as well as our center’s records. For this study, we have determined grades and course completion to be appropriate indicators of success along with repeated (voluntary) use of writing center appointments. A 2006 Department of Education study determined that students who have successfully completed college composition credits by the end of their second year of college graduate from four-year institutions at a rate of 82.3% as opposed to 53.4% of students who have not completed composition credits in that time (Adelman 63). Such statistics document the power of forward progress while leaving the mechanism of such progress frustratingly obscure. Rather than determining efficacy through the evaluation of writing sessions or ethnographic case studies, this study then examines efficacy in terms of improved academic outcomes in composition courses during the semester when the student used the writing center and increased usage of the writing center in later semesters when not required to do so.
Although we cannot account with these measures entirely for those students who, for example, improve as writers but are still unsuccessful in completing a course, at the very least, together these two measures correspond to our beliefs that better writers tend to be those who 1) continue to seek help when they need it, whether or not they are required to do so, and 2) also tend to succeed academically at higher rates than those who do not adopt the practices of engaging in extended, collaborative composition processes. The limitations of such assessment criteria, however, are obvious when one considers that those most motivated to visit the writing center are likely to be engaged in other efforts that impact their success. To some extent, these limitations may be mitigated by the fact that on our campus some sections of developmental composition courses require students to attend writing center sessions while others do not; thus, we may compare the outcomes and behaviors of students who are required to make use of the writing center with those of students not required to make use of the writing center. We will briefly discuss later the comparative merits of required sessions as opposed to voluntarily initiated sessions, but for now it is sufficient
to say that this degree of control was useful in establishing groups of students with roughly comparable academic and motivational characteristics. It is also limiting that we cannot account with these methods for those students who improve as writers but do not succeed in progressing through their required courses; however, as both universities and students place high value on forward progress, we feel it is appropriate to use passing grades as an indicator of success in this study. From the fall of 2013 through spring of 2016, 1,301 students were enrolled in composition courses on our campus, with 349 of these being writers in developmental courses. Our writing center serves approximately 14% of the campus population every year, a number we have seen increase since two professors in 2013-2014 began requiring students in their developmental courses to attend a minimum number of writing sessions each semester. The D-F-withdraw (DFW) rates for developmental writing courses on our campus have averaged 32.7% over the past six semesters, an improvement over previous years (Kent State University Research Planning and Institutional Effectiveness). Writers in developmental courses are those who have been identified by the university as in need of remediation through standardized testing or through qualitative evaluation of their previous writing. These writers are placed in a two-semester version of our first college writing course (College Writing I) in the belief that a slower pace will allow for greater attention to building the foundational reading and writing skills needed for their university education. This two-semester course sequence (“College Writing I Stretch”) is followed by our second-tier course that is taken by all students intending to complete a four-year degree.
As many have suggested, the category of basic or developmental writing is necessarily difficult to define, though in general writers whose work has been categorized in this way may share in common challenges with reading, interpreting, and writing in an academic context due to linguistic, educational, and cultural differences (Lunsford; Matsuda; Smith; Sternglass). On our campus, typically these writers lack adequate college preparation in their educational background and linguistically may be distinguished by local dialectical variation. We approached this study with the belief that there is a significant and positive correlation between frequent visits to the writing center in early semesters and greater academic success for students enrolled in developmental writing courses. We hypothesized that 1) students who were required to go to the writing center in their first semester would be
more likely to visit the center in subsequent semesters; 2) students with lower baseline ACT Reading scores (as compared to baseline ACT English scores) would be less likely to earn a passing grade in their first writing course; 3) students enrolled in courses with required writing center sessions would be more likely to pass these courses than students who were enrolled in courses without such requirements; 4) students who were required to visit the writing center and did so would be more likely to pass the course than students who were required but did not visit the writing center; 5) there would be no difference in course outcome between students who did not visit the writing center, regardless of whether they were required to visit or not required.
Methodology
Quantitative data regarding student baseline standardized testing scores, course placement, course outcomes, and writing center usage were collected for fall and spring semesters from the 2013-2016 academic years at the Kent State University Salem campus. During this period, 1,301 individual students were enrolled in college writing courses that included developmental sections of our first-tier writing course, regular sections of first-tier writing, and our second-tier writing course. Of the 1,301 participants in this study, 16% of students self-identified as first generation. Students taking developmental writing classes accounted for 26.7% of the sample in the first semester of the study and 29.9% in the second semester. Only 15.8% of the participants were required to visit the writing center in the first semester of data collection while 24.0% of participants were required in the second semester. The data were analyzed in order to determine the relationship between writing center usage and successful course completion, with success defined as a passing grade of C- or better. Other outcomes (Ds, Fs, and withdrawals) were defined as not passing in order to be consistent with the university’s institutional research data. The study examined variables including required/not-required writing center usage, baseline test scores, and the frequency of writing center sessions in a given semester. Qualitative data in the form of student interviews were also collected in the 2016-2017 academic year. Although these interviews were recorded face-to-face, students were guaranteed anonymity if desired. Survey participants were selected on the basis of participation
in at least one writing center session. In all, 10 students were interviewed for this study. The survey used open-ended questions as well as Likert scale ratings. Due to the small sample size, we have not referenced the Likert scale ratings for this study. The survey may be found in Appendix A.
Results
General Descriptive Information
Table 1 (see Appendix) summarizes course outcomes for the first two semesters of writing classes for students registered. Data from successive semesters is not included in this table due to low sample size; that is, few students registered for more than two semesters of writing. Students with a grade of C or better received a P (passing) grade. We also report grades of D and F, withdrawals (W), students who stopped attending (SF), as well as students who registered for the course but never attended (NF). As seen in Table 1, students were more likely to pass their second-semester writing course than their first (69.4% versus 79.6%) and less likely to withdraw (14% versus 8.5%). Table 2 (see Appendix) provides data on the number of visits to the writing center over the course of four successive semesters of writing classes. Visits per student were lowest for the first semester, peaked in the second semester, then decreased from that point. Means are low due to the large number of students who did not utilize the writing center at all.
Hypothesis Testing: Whole Sample
We hypothesized that students who were required to go to the writing center in their first semester would be more likely to visit the center in subsequent semesters. To test this hypothesis, we used a one-way analysis of variance (ANOVA). Our independent variable was required/not-required to visit the writing center during the first semester, and the number of visits in the second, third, and fourth semesters were used as dependent variables. We found that students who were required to visit the writing center in their first semester were more likely to visit in their second-semester course (F [1, 414] = 154.27, p < .001), more likely to visit the writing center in their third-semester course (F [1, 59] = 10.9, p = .002), but not more likely to visit in their fourth-semester course (F [1, 9] = 0.22, p = .65).
However, the fourth semester suffers from a very small sample size of only 10 students. Taking these results together, we find strong support for the hypothesis that students who are required to visit the writing center in their first semester are more likely to visit in subsequent semesters.
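For readers less familiar with this test: a one-way ANOVA compares variability between groups to variability within them. The sketch below is our illustration, not the authors' analysis code, and the visit counts are invented; it computes the F statistic by hand for two hypothetical groups of students.

```python
import statistics

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA: the ratio of between-group
    to within-group mean squares."""
    all_vals = [x for g in groups for x in g]
    grand_mean = statistics.mean(all_vals)
    k, n = len(groups), len(all_vals)
    # Between-group sum of squares: each group's mean vs. the grand mean
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                     for g in groups)
    # Within-group sum of squares: each score vs. its own group's mean
    ss_within = sum((x - statistics.mean(g)) ** 2
                    for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented second-semester visit counts for two groups of students
required_first_sem = [4, 3, 5, 2, 6, 4, 3, 5]      # required to visit in semester 1
not_required_first_sem = [0, 1, 0, 2, 1, 0, 1, 0]  # not required in semester 1

f_stat = one_way_anova_f(required_first_sem, not_required_first_sem)
print(f"F(1, 14) = {f_stat:.2f}")  # a large F indicates the groups differ
```

With SciPy available, `scipy.stats.f_oneway` performs the same computation and also returns the p-value reported alongside each F statistic above.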
We also hypothesized that students with lower baseline ACT Reading scores (as compared to baseline ACT English scores) would be less likely to earn a passing grade in their first writing course. Over half of the students in this study took the ACT, and the remaining students completed an in-house placement test. For those students who did take the ACT (N = 795), the sample mean for Reading was 20.75 (SD = 4.79) and the sample mean for English was 19.08 (SD = 4.66). We used a one-way ANOVA to compare ACT scores between students who passed their first writing course and students who did not pass. Both the ACT English score and the ACT Reading score significantly predicted passing rate (English: F [1, 794] = 22.58, p < .001; Reading: F [1, 794] = 8.23, p = .004). As expected, ACT English was a slightly stronger predictor of pass rate than ACT Reading. Figures 3 and 4 (see Appendix) show the difference in ACT scores between students who passed and did not pass their first writing course. We also used a one-way ANOVA to compare the number of visits to the writing center between students who passed their first writing class and those who did not. As seen in Figure 5 (see Appendix), students who passed their first writing course visited the writing center at a higher rate than students who did not pass, although this difference was not statistically significant when using the entire sample, which included students not in developmental courses (F [1, 1298] = 2.87, p = .09).
Hypothesis Testing: Developmental Writing Courses
In addition to the whole-sample analysis, we also examined only those students enrolled in developmental writing courses during their first two semesters (N = 348).
Although the whole-sample analysis did not reach significance, we found that students enrolled in developmental writing courses who passed their first- and second-semester courses made significantly more visits to the writing center than students who did not pass (F [1, 347] = 10.86, p = .001). Figure 6 shows the results of students’ first-semester outcomes, Figure 7 collapses outcomes into pass/fail, and Figure 8 displays second-semester outcomes (see Appendix).

Not all instructors who taught developmental writing courses required their students to visit the writing center. We hypothesized that students enrolled in courses in which the writing center was required would be more likely to pass than students enrolled in courses in which it was not required; that is, that requiring students to visit the writing center would positively impact course outcomes. However, this hypothesis was only
partially supported. Using a chi-squared analysis, we found that students who were required to visit in their first-semester course were no more likely to pass than students enrolled in courses in which going to the writing center was not required (χ2 [350] = 0.13, p = .72). However, we found that students who were required to visit the writing center were more likely to pass their second-semester writing course (χ2 [194] = 13.22, p < .001).

We then narrowed our sample to students enrolled in developmental writing courses that required visits to the writing center (i.e., we excluded students enrolled in courses in which the writing center was not required). Of this subset (N = 194), 77 students visited the writing center at least once in the semester and 117 did not. We hypothesized that students who were required to visit the writing center and did so would be more likely to pass the course than students who were required but did not visit. This hypothesis was supported: students who were required to visit the writing center and did so were significantly more likely to pass the course than students who were required but did not visit (χ2 [194] = 10.54, p = .001). Only 48% of students who did not use the writing center when required passed the course, while 71% of students who used the writing center when required received passing grades.

In addition to simple pass/fail, we also examined specific grade outcomes. Students who were required to use the writing center and did so were not only more likely to pass the course, but also less likely to withdraw from or stop attending the course (χ2 [194] = 13.78, p = .008). Finally, we hypothesized that there would be no difference in course outcome between students who did not visit the writing center, regardless of whether or not they were required to visit.
Of the students enrolled in developmental writing courses in the first semester, 255 made no visits to the writing center. We compared the course outcomes of students who were required to visit the writing center but did not with those of students who were not required to visit and did not. We expected to find no difference in pass/fail rate between these two groups, and this hypothesis was supported (χ2 [225] = 1.32, p = .25).
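The 2×2 chi-squared comparisons reported above can be checked directly from the published counts. Of the 194 students required to visit, 77 visited and 117 did not, and the reported pass rates (71% and 48%) imply roughly 55 and 56 passing students, respectively. These inferred, rounded counts are an assumption on our part, but plugging them into a chi-squared test without Yates' continuity correction recovers a statistic very close to the reported χ2 = 10.54:

```python
from scipy.stats import chi2_contingency

# Contingency table inferred from the reported percentages:
# rows = visited / did not visit; columns = passed / did not pass.
table = [[55, 22],   # 77 required students who visited: ~71% passed
         [56, 61]]   # 117 required students who did not visit: ~48% passed

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}, dof = {dof}")
```

Note that a 2×2 table has one degree of freedom; the bracketed values in the in-text reports (e.g., χ2 [194]) give the sample size rather than the degrees of freedom.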
Discussion

To summarize the major findings of this study:

1. Students required to visit the writing center in their first semester of taking a writing class were more likely to visit the writing center in future semesters.
2. ACT English and ACT Reading both predicted passing rates for the first semester of taking a writing class, with English being a slightly stronger predictor than Reading.

3. Students who passed their first writing class visited the writing center more than students who failed. This was true for the entire sample and for students enrolled in developmental writing courses.

4. For students in developmental writing courses, requiring students to visit the writing center did not predict first-semester pass rate, but did improve their second-semester outcomes.

5. For students in developmental writing courses, students who were required to visit and did so were more likely to pass than students who were required to do so and did not visit. There was no difference between students who were required and did not attend writing center sessions and those who were not required and did not.

6. A lower percentage of students had to retake their second-semester course. Conversely, our rate of retention in second-semester writing courses was higher, and the “stopped attending” failures were almost cut in half.

Overall, these results suggest a significant and positive relationship between requiring visits to the writing center in early semesters (the first two semesters of our campus’ writing sequence) and greater academic success for students enrolled in developmental writing courses throughout their time in the sequence. A key point of emphasis here is that merely requiring students to visit does not inevitably result in greater success; some students will of course fail to visit whether or not they are required to use the writing center. However, requiring students to visit does cause students to visit more often, pass at higher rates, and return to use the writing center in later semesters at higher rates, even when they are no longer required.
Our findings show that ACT English and ACT Reading scores both predicted passing rates for the first-semester writing class, with English being a slightly stronger predictor than Reading. This finding stands in contrast to studies that found ACT English and Reading scores were not reliably predictive of student passing rates (Hassel and Giordano; Belfield and Crosta). However, a 2012 report in the ACT Research Report Series (Radunzel and Noble) produced results consistent with ours.
Students who “hit” their Reading benchmark, a score of 18-21, were more likely to achieve a B average GPA in their first year of college (9); grade outcomes for those who did not “hit” their “marks” were in turn twenty percent lower (18). As with several similar studies using ACT and SAT scores,1 the 2012 findings indicate that the lower the test scores, the lower the rates of satisfactory course completion and graduation (48).

For economic and personal reasons, our students cannot afford a lengthy, years-long graduation process or, more drastically, the prospect of never completing their degree. To avoid this, they need to learn the writing and reading skills required to pass the courses of our writing sequence in the allotted time. Perhaps our writing center aids such progress because our tutors are trained to focus on the higher-order writing concerns emphasized in the latter half of our campus writing sequence, namely College Writing I Stretch and College Writing II. The predictive power of ACT scores and their use in writing course placement for entering freshmen leads us to recommend that instructors require multiple visits for all students placed in developmental writing courses.

We suspect that repeated visits not only helped students become better writers through instrumental support, but also allowed students to feel more comfortable in the writing center itself. We interpret these results to show, in part, that continual use of the writing center by students in developmental writing courses shifts educational attitudes and behaviors in advantageous ways. Although required visits may present challenges both practical and pedagogical, as Jaclyn Wells has pointed out (2016), the research strongly suggests that the student-perceived benefits of required tutoring outweigh these challenges; still, we must implement required sessions with nuance in order to achieve the benefits suggested in the data above.
During interviews, one student, who was required to attend the writing center only once, described her shift in attitude: “I was required to go,” she said, “but . . . I found it useful. So, I kept coming not only because I was required, but it gave me . . . direction on what to fix and if I was on the right track.” This observation voices a student perspective that we believe is demonstrated in the data above as well as in previous studies (Bishop; Gordon): requiring students to attend can often result in subsequent voluntary attendance, and in fact 81% of students in Barbara Lynn Gordon’s study recommended that instructors require future classes of students to attend writing center sessions (157). Although Wendy Bishop has noted the tendency of students to regard required sessions as deserving some form of “recompense” (37)
and Gordon acknowledges the logistical as well as pedagogical challenges inherent in required visits (159-161), the consensus among RAD (replicable, aggregable, data-supported) studies of writing centers appears to confirm a positive student reaction to required visits. As students become more comfortable with the writing center, they also adopt a more sophisticated understanding of writing and of themselves as writers, due to sustained interaction with writing tutors.

Several other students surveyed spoke of the writing center as a place for “guidance,” especially noting how it either supplemented course lessons or provided benefits the students found lacking in the classroom. Student 2 told us: “Last semester, I wasn’t really getting much feedback. So [visiting the writing center helped] me to know what to do . . . It was [sic] guidance.” Establishing the writing center as a comfortable place, one of facilitation instead of punishment, where students can have agency over their work and their time in a way unlike the classroom, lays the groundwork for building better student attitudes and developing the writing skills needed for future classes in our campus’ writing sequence. Student 1 commented: “I was kind of annoyed at first, but then I realized it was helping.” Other students similarly noted that being required led them to do something they had initially found either unappealing or impractical, and that they were likely to continue using the writing center as a resource in future courses.

Brian Huot has suggested that good writers tend to be able to evaluate writing well, and that instruction that produces good writers “requires that students and teachers connect the ability to assess with the necessity to revise, creating a motivation for revision that is often difficult for students” (174).
We interpret responses like those above, as well as repeated, voluntary visits to the writing center, as evidence of students responding to the combination of informal evaluative conversations with tutors and more formal evaluative responses from their instructors. They are becoming better at assessing writing and thus better—and more engaged—in revision.

Beyond a general sense of goodwill towards tutoring sessions, two other students mentioned specific writing skills in their responses. Such responses offer insight into how the writers themselves link their progress as writers with their evolving (and portable) skills:

Student 3: I think the skills they’ve shown here will help in other classes . . . The different revision, the reading backwards, reading it out loud, just all the things [like] the printing it out and being able to mark on it. I think it was helpful because a lot of the tutors
have had the class before me and they understand what the prompt is and they have different ways that they went about it.

Student 4: I like the way [the tutor] had just the little things about how to change my sentences and reword them. [The tutor] just showed me the development of my paper and how to change my paragraphs. It helped me realize what needed to be changed in my structuring and grammar in my papers.

Here we see emerging for these writers the notion that later writing projects will be continuous with the experiences they have had as writers in their early developmental courses and in the writing center. We also find in these interviews an awareness that tutors are writers who, like the writing center clients, have developed flexible, unique approaches to thinking about the assigned coursework. In her analysis of basic writers’ motivation, Heather Robinson observed that “through a series of tutoring sessions, students whom we might consider to be basic writers show movement towards seeking assistance with those types of writing skills that we would associate with student writers who have stronger skills, and who thus do not fit the basic writer profile so readily” (76), an observation that reinforces the power of repeated visits.

These writers understand writing as fundamentally social, complex behavior requiring engagement with other writers, and it is this understanding, we believe, that not only assists instrumentally with improved writing, but advances their progress as writers well beyond their initial developmental courses. In seeing success, they gain confidence and agency. Writing center visits are no longer a hindrance but a habit, a fully integrated part of their learning, writing, drafting, and revising process. By gaining a measure of control and further awareness of how academic writing works, they do not so readily throw in the towel when faced with more difficult assignments in later writing courses.
Evidence of this can be seen in our campus’ DFW rates; students required to use the writing center were less likely to withdraw from or stop attending their developmental course. Our student interviewees mentioned being “crunched for time” and being concerned that required visits “might be a waste of time,” yet after a semester of visiting the writing center they called such requirements a “good thing.” One even went further:

Student 4: I think even through College Writing 2 that they should require you to [go to] the Learning Center because when they’re not forced to I feel like they still wait [until] the last minute and their writing isn’t as good. I have . . . friends . . . that have a lot of trouble
and if they come in once and don’t get the help they want they don’t tend to go back and they might fail or they might not do as well, so I think it’s very helpful for them to be required to do it.

Note that this supports the notion that one visit to the writing center is not enough. Again, those students who showed the greatest success had multiple required writing center visits and were therefore more inclined to use the center in future semesters and even recommend it to their peers. Shifts in attitude and behavior toward education, of course, take time. To this end, the student response above raises the question of whether instructors should require writing center visits beyond the first-tier courses. As our results indicate, students who visited the writing center during their second-tier course were more likely to pass than those who did not. During our second-tier writing course, students not only have to read more complex texts, but they must also complete extensive research assignments. Although not necessarily new to them, these research-driven assignments, given their length and instructor expectations, can feel overwhelming and even entirely new. As indicated above, students feel that although tutors give them helpful advice about specific aspects of their work, most significant of all is the feeling of a writing support system, a place for guidance and even refuge, a safe space to discover, as David Bartholomae describes, “that the errors in [their] writing fall into patterns, that those patterns have meaning in the context of [their] own individual struggle[s] with composing, and that they are not, therefore, evidence of confusion or a general lack of competence” (86).
Conclusion

A 2010 survey of remediation in California’s extensive community college network offers two key recommendations: 1) there should be greater uniformity of practice among remedial instructors within institutions, and 2) there should be greater focus on providing individualized support for these students (Grubb). The study concludes by observing, “One of the enduring problems in remedial classes, therefore, is how to impose adequate demands on students while simultaneously providing the moral and academic support so that they will continue their education” (13). On our campus, support is paramount to achieving successful educational outcomes. As previous studies of required appointments have noted, the primary reason given by most students for not using the writing center is a lack of time (Bishop; Clark;
Gordon). Many of our students work part- or full-time jobs while juggling the challenges of school, finding childcare, paying bills, et cetera. It is the usual regional-campus story. Many of our students feel as if they have been left out of the national conversation for, in some cases, several generations, making exposure to the unfamiliar university setting emotionally and socially fraught. In addition, older, nontraditional students sometimes have to make a technological leap to keep up with their coursework. Writing center appointments provide them with space to learn and practice writing skills, but they also allow our students to see that college itself is not a set of rigged hoops through which they must jump. Requiring appointments puts them in extended contact with other students with similar backgrounds who can model skills and translate goals, make those goals seem less imposing, and assist writing clients in finding their sense of agency. With an understanding of the goals and pedagogical approaches taken in these courses, and with a sense of agency cultivated by peer tutoring, these students are better positioned to succeed in the university.

Further study is needed, particularly with regard to whether and how writing center usage impacts persistence and degree completion for these students; a more nuanced examination of how motivation and agency shift over time for students who are required to attend writing center visits would also be useful. The finding that requiring students to participate in writing sessions can have a positive and significant impact on their development as writers also has implications for tutor training. How can we prepare tutors to shift the dynamic that results when a student fulfills a requirement rather than actively seeks assistance? Which narratives do we construct in collaboration with these students about their progress and agency as writers?
We would suggest that requiring students enrolled in developmental writing courses to take advantage of the support available in writing centers improves the likelihood that they will meet the rigorous demands of academic writing courses while also, perhaps contrary to expectation, strengthening their sense of agency. Particularly on small campuses like ours, a minor shift in practice may have magnified, visible effects on campus climate and improved retention rates. Most significantly, it allows us to better fulfill the promise of our open-enrollment mission, extending higher education to a community in need of greater access and opportunity.
Notes

1. See Allen et al.; DeAngelo et al.; Sackett et al.

Works Cited
Adelman, Clifford. The Toolbox Revisited: Paths to Degree Completion From High School Through College. U.S. Department of Education, 2006. ERIC. http://files.eric.ed.gov/fulltext/ED490195.pdf

Allen, Jeff, et al. “Third-Year College Retention and Transfer: Effects of Academic Performance, Motivation, and Social Connectedness.” Research in Higher Education, vol. 49, no. 7, 2007, pp. 647-664. ERIC. http://web.a.ebscohost.com.proxy.library.kent.edu/ehost/pdfviewer/pdfviewer?vid=66&sid=44bed7c1-8f65-4691-9c0e60729a745c6d%40sessionmgr4009

Bartholomae, David. “Teaching Basic Writing: An Alternative to Basic Skills.” Journal of Basic Writing, vol. 2, no. 2, 1979, pp. 85-109. ERIC. https://wac.colostate.edu/jbw/v2n2/bartholomae.pdf

Belfield, Clive, and Peter M. Crosta. “Predicting Success in College: The Importance of Placement Tests and High School Transcripts.” Working Paper. Community College Research Center, Columbia U, Feb. 2012. ERIC. http://www.eric.ed.gov/contentdelivery/servlet/ERICServlet?accno=ED529827

Bishop, Wendy. “Bringing Writers to the Center: Some Survey Results, Surmises, and Suggestions.” The Writing Center Journal, vol. 10, no. 2, 1990, pp. 31-44. The Writing Centers Research Project. http://casebuilder.rhet.ualr.edu/wcrp/publications/wcj/wcj10.2/10.2_Bishop.pdf

Clark, Irene Lurkis. “Leading the Horse: The Writing Center and Required Visits.” The Writing Center Journal, vol. 5/6, no. 2/1, pp. 31-34. The Writing Centers Research Project. http://casebuilder.rhet.ualr.edu/wcrp/publications/wcj/wcj5.2_6.1/wcj5.2_6.1_clark.pdf

David, Carol, and Thomas Bubolz. “Evaluating Students’ Achievements in a Writing Center.” The Writing Lab Newsletter, vol. 9, no. 8, Apr. 1985, pp. 10-14.

DeAngelo, Linda, et al. Completing College: Assessing Graduation Rates at Four-Year Institutions. Higher Education Research Institute at UCLA, 2011. https://heri.ucla.edu/DARCU/CompletingCollege2011.pdf

Gordon, Barbara Lynn. “Requiring First-Year Writing Classes to Visit the Writing Center: Bad Attitudes or Positive Results?” Teaching English in the Two-Year College, vol. 36, no. 2, Dec. 2008, pp. 154-163.

Grubb, W. Norton. “The Quandaries of Basic Skills in Community Colleges: Views from the Classroom.” NCPR Developmental Education Conference, 23-24 Sep. 2010, New York, NY, National Center for Postsecondary Research, Sep. 2010. ERIC. http://files.eric.ed.gov/fulltext/ED533875.pdf

Hassel, Holly, and Joanne Giordano. “The Blurry Borders of College Writing: Remediation and the Assessment of Student Readiness.” College English, vol. 78, no. 1, 2015, pp. 56-80. National Council of Teachers of English. http://www.ncte.org.proxy.library.kent.edu/library/NCTEFiles/Resources/Journals/CE/0781sep2015/CE0781Blurry.pdf

Huot, Brian. “Toward a New Discourse of Assessment for the College Writing Classroom.” College English, vol. 65, no. 2, Nov. 2002, pp. 163-180. doi: 10.2307/3250761

Jones, Casey. “The Relationship between Writing Centers and Improvement in Writing Ability: An Assessment of the Literature.” Education, vol. 122, no. 1, 2001, p. 3+. Opposing Viewpoints in Context, link.galegroup.com/apps/doc/A80856249/OVIC?u=txshracd2598&xid=b60f12bd. Accessed 27 Nov. 2017.

Kent State University Research Planning and Institutional Effectiveness [RPIE]. Grade Distribution Report Fall 2013 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.

---. Grade Distribution Report Spring 2014 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.

---. Grade Distribution Report Fall 2014 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.

---. Grade Distribution Report Spring 2015 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.

---. Grade Distribution Report Fall 2015 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.

---. Grade Distribution Report Spring 2016 ENG 01001, ENG 01002, ENG 11011, Salem Campus. Data File. Kent State U, 2016.
Lunsford, Andrea. “Cognitive Development and the Basic Writer.” College English, vol. 41, no. 1, 1979, pp. 38-46. JSTOR. doi: 10.2307/376358

Matsuda, Paul Kei. “Basic Writing and Second Language Writers: Toward an Inclusive Definition.” Journal of Basic Writing, vol. 22, no. 2, 2003, pp. 67-89. ERIC.

Naugle, Helen. “How Georgia Tech’s Lab Prepares Students for the Georgia Mandated Proficiency Exam.” The Writing Lab Newsletter, vol. 5, no. 4, 1980, pp. 5-6. http://web.a.ebscohost.com.proxy.library.kent.edu/ehost/pdfviewer/pdfviewer?vid=29&sid=44bed7c1-8f65-4691-9c0e60729a745c6d%40sessionmgr4009

North, Stephen M. “The Idea of a Writing Center.” College English, vol. 46, no. 5, Sep. 1984, pp. 433-446. JSTOR. doi: 10.2307/377047

Radunzel, Justine, and Julie Noble. Predicting Long-Term College Success through Degree Completion Using ACT[R] Composite Score, ACT Benchmarks, and High School Grade Point Average. ACT Research Report Series, 2012. ERIC. http://files.eric.ed.gov/fulltext/ED542027.pdf

Robinson, Heather M. “Writing Center Philosophy and the End of Basic Writing: Motivation at the Site of Remediation and Discovery.” Journal of Basic Writing, vol. 28, no. 2, Fall 2009, pp. 70-92. ERIC. http://files.eric.ed.gov/fulltext/EJ877256.pdf

Sackett, Paul R., et al. “Does Socioeconomic Status Explain the Relationship Between Admissions Tests and Post-Secondary Academic Performance?” Psychological Bulletin, vol. 135, no. 1, Jan. 2009, pp. 1-22. doi: 10.1037/A0013978

Sadlon, John. “The Effect of a Skills Center upon the Writing Improvement of Freshman Composition Students.” The Writing Lab Newsletter, vol. 5, no. 3, 1980, pp. 1-3.

Smith, Cheryl Hogue. “‘Diving in Deeper’: Bringing Basic Writers’ Thinking to the Surface.” Journal of Adolescent and Adult Literacy, vol. 53, no. 8, May 2010, pp. 668-676. JSTOR. http://www.jstor.org.proxy.library.kent.edu/stable/25653927

Sternglass, Marilyn. “The Need for Conceptualizing at All Levels of Writing Instruction.” Journal of Basic Writing, vol. 8, no. 2, 1989, pp. 87-98.

Sutton, Doris G., and Daniel S. Arnold. “The Effects of Two Methods of Compensatory Freshman English.” Research in the Teaching of English, vol. 8, no. 2, Summer 1974, pp. 241-249. JSTOR. http://www.jstor.org.proxy.library.kent.edu/stable/40170528

Wells, Jaclyn. “Why We Resist ‘Leading the Horse’: Required Tutoring, RAD Research, and Our Writing Center Ideals.” The Writing Center Journal, vol. 25, no. 2, Spring/Summer 2016, pp. 87-114.

Wills, Linda Bannister. “Competency Exam Performance and the Writing Center.” The Writing Lab Newsletter, vol. 8, no. 10, 1984, pp. 1-4.
Appendix
Note: Students who passed their first-semester writing course had significantly higher ACT Reading scores than students who failed their first-semester writing course.
Note: Students who passed their first-semester writing course had significantly higher ACT English scores than students who failed. The difference in scores between students who passed and students who failed is greater for ACT English than ACT Reading.
Note: Students who passed their first-semester writing course visited the Writing Center at a higher rate than students who failed their first-semester writing course.
Note: Students who passed their first-semester developmental writing course visited the Writing Center significantly more often than students who stopped attending (SF), received a D or F, withdrew from the course (W), and never attended (NF).
Note: This figure collapses the results of Figure 6 to pass/fail.
Note: Students who passed their second-semester developmental writing course visited the Writing Center significantly more often than students who received a grade of D or F, withdrew from the course (W), stopped attending (SF), or never attended (NF).
Interview Script

PI/Co-Investigator: Would you be willing to participate in a brief interview regarding your visit to the Learning Center? It will only take about 15 minutes. If you’re willing to participate, please fill out this brief consent form, taking time to read it through and ask any questions you might have. You may remain anonymous in the study or choose to be identified by indicating your preference on the consent form. I will be the person who conducts the interview. Your responses will not be shared with the tutors or your professors.

Interview questions:
1. What was your first writing course at the university?
2. Which course are you in now?
3. Why did you first start coming to the Learning Center?
4. If you came voluntarily, how did you hear about it and what were you told beforehand?
5. If you were required to visit the Learning Center, who told you to come and what were you told beforehand? How did you feel about being required?
6. If you were required to visit, how did your attitude about being required change over the course of the semester (if it did)?
7. How would you describe your tutors and the tutoring sessions?
8. In your tutoring sessions, did your tutors talk with you about your “outside class” reading and writing habits? If so, did they make connections between them and the things you need to do in college courses?
9. On a scale of 1-5, with 1 being not at all helpful and 5 being very helpful, how helpful would you say the tutoring was?
10. If it was helpful, what was helpful about it?
11. If it was not, why wasn’t it? What do you think would have been more helpful?
12. Have you used the Learning Center even when you weren’t required? Why or why not?
13. On a scale of 1-5, with 1 being not at all likely and 5 being very likely, how likely are you to use the Learning Center for help with future writing assignments?
14. On a scale of 1-5, with 1 being “don’t agree” and 5 being “very much agree,” please rate the following statements:
- The Learning Center helped me become a better writer.
- The Learning Center changed how I thought about the writing process.
- The Learning Center changed how I thought about myself as a writer.
- The Learning Center helped me understand university writing.
- The Learning Center helped me adapt my usual reading and writing skills to the new demands of university classes.
- (If applicable) I am glad my professor required me to use the Learning Center.
INSTITUTIONAL ASSESSMENT OF A GENRE-ANALYSIS APPROACH TO WRITING CENTER CONSULTATIONS Brett Griffiths Macomb Community College griffithsb09@macomb.edu
Randall Hickman Macomb Community College hickmanr@macomb.edu
Administrators of writing centers at two-year colleges keenly feel the call for improved research and greater visibility of writing “outcomes.” In the past decade, federal calls to raise rates of education attainment have inspired various initiatives to increase rates of student completion in degree and certificate programs (e.g., Achieving the Dream and the Guided Pathway initiative). At the same time, greater public awareness brings with it additional pressure and scrutiny on educational resources to demonstrate specific and measurable impacts. Chief among these— in the spotlight for education reform for over 100 years—is college writing.1 Placed within the context of college attainment goals and what has been called “accountability funding” in K-12, writing center administrators at two-year colleges find themselves again revisiting assessment and research practices with the goal of demonstrating to administrators and policymakers that writing centers improve “student success.” Identifying how “student success” is an impact of writing instruction can be difficult, precisely because writing and thinking are interactive processes, meaning that the content of written academic work often reflects a moving canvas—a place where students’ ideas are changing while they are working. Education policy initiatives often privilege statistical outcomes, such as course completion and GPA, both snapshots of performed knowledge. Meanwhile, prevailing writing pedagogies, such as those published in the Council of Writing Program Administrators et al.’s Framework for Success in Postsecondary Writing (hereafter, Framework), aim to identify and support the interactive learning process itself, including writers’ abilities to recognize rhetorical and social expectations of specific writing situations, to reflect on their writing choices and learning processes (metacognition), to create, and to adapt in response to those changing demands. 
Sebastian Zöllner University of Michigan szoellne@umich.edu
However, statistical assessments do not detract from ethical teaching in and of themselves. Statistical measures become problematic when they are used as whole-cloth measures of ambiguous targets, such as “success,” when such targets are decontextualized from students’ own goals and the learning context in which they are immersed. Rather, using statistical measures to test, evaluate, and improve our knowledge about quality instruction can be a valuable strategy for demonstrating the quality and import of the work we do. Given the commitment of writing instruction professionals to support students in the development of habits of mind (rhetorical knowledge, reflection, writing processes) that empower students to identify expectations and transfer knowledge across situations, writing instruction professionals have a responsibility to actively seek institutional collaborations that showcase our values and evaluate our approaches. Stephen North has nicely summarized the goal of writing centers as creating “better writers, not better writing” (438). “Better writers,” from the perspective of the Framework, are more aware of their own practices and better able to adapt to and support themselves when faced with new writing situations. Yet it is difficult to design and advocate for pedagogies that improve writers outside of conversations about outcomes assessment and funding. The schism between the two perspectives that writing instruction professionals face—improving student completion or developing better habits of mind, better writing or better writers—can hamper the potential for fruitful conversations between writing pedagogy professionals and administrators. This schism can poorly position writing pedagogy experts to steer institutional conversations about writing pedagogy and supports (Toth et al.). Here, we believe we have developed a tutoring protocol and assessment that help us advocate for and implement some of the principles of current writing pedagogy while also evaluating impact in the measures preferred by our administrative colleagues. 
This article aims to put these two perspectives together in what Linda Adler-Kassner has called a “principled connection”—an assessment evolved from a collaboration between advocacy for disciplinary values for writing pedagogy and institutional ways of identifying and interpreting student success initiatives. Our goals here are pragmatic. Specifically, we set out to devise an approach to writing center tutoring that would foster the development of the habits of mind described in the Framework and to assess the impact of that protocol, as well as possibly to use the statistical
Institutional Assessment of a Genre-Analysis Approach to Writing Center Consultations • 37
measures preferred for institutional reporting. Therefore, this paper has two parts: a description of our writing consultation protocol and a description of our assessment outcomes. The assessment outcomes were developed and analyzed in collaboration with our institutional research office. We recognize this article cannot—and does not hope to—resolve ongoing divisions in assessments, pedagogies, and standards. Nevertheless, we argue that it is ethically incumbent upon all of us in writing studies to explain our methods to our institutional/administrative colleagues and to demonstrate the effectiveness of the individual and reflective processes for which we advocate. The authors of this article began with the premise that many writing center administrators and their institutional colleagues hold distinct expectations for the purpose and process of writing center work. These distinctions become highlighted through assessment reporting, because the goals and discourses of “outcomes” index community values associated with those distinctions. As such, we begin by articulating clearly to our administrative colleagues—and our writing center colleagues—our tutoring protocol and rationales. Through this articulation we discursively link our protocol to scholarship and recommendations in writing studies, including concepts of “uptake” and “transfer” associated with the Framework for Success in Postsecondary Writing and genre analysis.2 To illustrate our model for the purpose and process of writing center work, we developed an infographic depicting what we called the “consultation hierarchy” (Figure 1), which placed genre knowledge—understanding the assignment and attaching each assignment to prior experiences writing similar genres—at the top of the hierarchy and placed grammar, usage, and mechanics (GUM) at the bottom of the hierarchy. 
Our intention with this infographic was to emphasize the importance of understanding the rhetorical situation and social expectations associated with specific writing situations first, before moving into questions about content, structure, language choice, et cetera. This decision helped the new writing center director articulate to administrative colleagues and superiors the goals of writing center work, while also acknowledging the dilemma of assessment in process-driven learning. In effect, we aimed to proactively acknowledge and reject expectations for the writing center as a location for editing or proofreading and to divorce “outcomes” from measures of grammatical correctness or the grades of individual assignments. This move was rhetorically important because of the tenuous nature of writing center funding and because our emphasis was on writing knowledge that students transfer, not on the conventionality or correctness of any single text students produced. Finally, announcing our assumptions about learning and the pedagogical principles that shaped our consultation sessions helped to more closely define the relationship between writing and knowledge transfer. Consequently, semester GPA, although not a perfect or singular measure of writing center impact, does assist us in tracking how well students are able to identify genre-specific writing expectations, to better advocate for their own learning processes, and to transfer knowledge about expectations and learning to other classes and contexts.
“Temporary Specially-Funded Strategic Initiative”: The Context for Assessment

It is useful here to understand the specific environment in which this writing center was founded and funded, because our situation mirrors the accountability context in education nationally, even while featuring specific local characteristics. In 2015, Macomb Community College—a large, multi-campus, two-year, open-admissions college north of Detroit—opened facilities at two of its campuses to support college students in their academic and professional reading and writing activities. These facilities, now called the Macomb Reading and Writing Studios (RWS), serve a diverse student body, including “academically-prepared” students as well as students who face challenges in their writing due to their academic preparation or language and dialect backgrounds. Multilingual writers whose first languages are other than English (e.g., Arabic, Bengali, Chaldean) comprise 21% of the students we serve, alongside a population of multidialectal speakers of English. The Reading and Writing Studios were initiated as one part of ongoing efforts to increase rates of persistence and completion among our students. This initiative also derived from our participation as a leader college in the Achieving the Dream initiative and from our commitment to student success in our assurance argument to the Higher Learning Commission. As such, the director also sought to rhetorically tie the principles of writing consultations to the measurements affiliated with both Achieving the Dream and the Higher Learning Commission. The director was hired to establish a new, multi-campus academic literacy center. 
As such, the director’s primary responsibility was to develop a “pedagogical vision” for tutoring reading and writing to monolingual and multilingual students across multiple campuses and to work with institutional research resources to develop an assessment plan for measuring impact and effectiveness. The project was strategically funded for a two-year pilot initiative with
Praxis: A Writing Center Journal • Vol 15, No 1 (2017) www.praxisuwc.com
the set goal of establishing a tutoring model that supports student persistence, while also providing regular assessment updates to the Provost, members of the Student Success Council, and the Achieving the Dream work team. Put simply, permanent funding for the center depended upon demonstrating that the devised tutoring approach had a positive impact on student success. Mindful of this context, the director developed a tutoring protocol intended to incorporate three primary principles of writing pedagogy:

• prioritizing the teaching of genre knowledge
• promoting students’ rights to their own language
• assisting students’ development of self-regulated learning strategies (e.g., conceptualizing, planning, and completing projects using individualized learning approaches)

These primary principles align with and reinforce the habits of mind identified in the Framework, including flexibility, reflection, and metacognition. Our session protocols initially invite students to describe their understanding of the criteria for their assignment and their experiences writing similar assignments. Tutor training emphasized the use of non-evaluative language and a focus on the social and historical expectations associated with genre types and prior learning. Our protocol then invites students to explicitly identify that prior learning and apply it to their new writing situation. Our session protocol heavily emphasizes student agency, encouraging our tutors to use terms such as “choice,” “decision,” and “reasons” to refer to writing moves, past and present. We actively discourage language associated with hierarchical views of language that describe Standard Academic American English as the pinnacle of writing success. 
We eschew terms like “correct” and “incorrect” grammar, usage, and mechanics in the publication of our services, the professional development of our tutors, and the language of our instruction. Additionally, session protocols encourage writers and tutors to conclude each consultation with a written session summary and a revision or work-plan list of “next steps,” which articulates the students’ writing process and/or knowledge. For example, “next steps” might include procedural components, such as “embed sources,” or knowledge components, such as “search the Internet for examples of writing similar to the assignment.” This “next steps” component of the protocol helps students to explicitly identify what they have learned in-session and encourages them to reflect
on their process and to plan the next stages of their current and/or future writing projects. Such activities prompt metacognitive thinking in which students engage in their individual writing process while planning the stages of completion and submission of their work (self-regulation). We have encountered two critiques of our protocol. One reviewer for this article seemed to suggest our approach was more directive than the current trend in writing center studies. A second critique, from a faculty member, suggested that the protocol appears rigid and inflexible. However, we argue that our protocol makes the roles of tutors as coaches and of writers as decision-makers more explicit. Moreover, we argue that the structured nature of our protocols provides an important rhetorical tool that makes visible the processes of metacognition and adaptation, which helps students to transfer their learning from one writing situation to another. Thus, the protocol emphasizes student agency, reflective learning, and individual processes, providing students with the language and knowledge to help them navigate conversations about writing and “success” outside the RWS. Taken as a whole, the articulation of our protocol and infographic aimed to bring together scholarship in writing studies, to reinforce the habits of mind identified in the Framework, and to implement learning support strategies demonstrated in developmental education scholarship, for the purpose of advancing a wider institutional understanding of writing instruction methods. Future funding of our writing center depended on our ability to link impact, as measured in institutional terms, to learning principles aligned with writing studies. To negotiate this space, we turned to writing center assessment research. 
Therefore, we advertised our principles and educated our colleagues about the scholarship and rationales that undergirded them, and we consciously referred to the principles and rationales in our outreach initiatives. Simultaneously, we co-developed an assessment protocol that linked outcomes (GPA and course completion) to the learning principles we espoused and the scholarship that shaped those principles.
Methods for Writing Center Assessment

Writing center administrators value and advocate for quality assessment and research on the impact of tutoring, though some disagreement remains about which methods comprise effective assessments and which kinds of assessment are considered “research.”3 Dana Driscoll and Sherry Wynn Perdue have found that RAD research—research that is replicable,
aggregable, and data-supported—comprises only 5.5% of all articles published in the Writing Center Journal from 1980 to 2009. Their follow-up study found that writing center directors bring broad, and sometimes conflicting, expectations to the concept of writing center research, as well as questions about which methodologies are best suited for such work (e.g., quantitative, qualitative). Taken together, the two articles culminated in a call for increased participation by writing center administrators in RAD research—both qualitative and quantitative—and greater attention paid by writing center administrators to the methods, implications, and limitations of each study. Neal Lerner has emphasized that writing center administrators and researchers can and should conduct their own research (“Counting Beans”). In an era of “outcomes” measurements and budget cuts, we must be more than “ticket tearers,” encouraging the identification and publication of the positive impacts writing centers have on student success (“Choosing Beans” 1). Yet Lerner has also cautioned that many variables complicate quantitative assessments of writing center impacts, including but not limited to students’ existing strengths as writers beyond what can be assessed through standardized testing, such as the SAT, as well as variability in the grading of assignments and courses across instructors (“Counting Beans” 3). There remain as-yet-unanswered questions about how well grades in any one course can meaningfully and consistently measure writing ability or learning, which Lerner addresses in both articles. We operate carefully within this context, recognizing that the pressure to demonstrate “outcomes”—often defined and measured by people outside our fields—is both fierce and urgent. In many cases like ours, student outcomes are tied to potential funding. 
Simultaneously, we are mindful of what compositionists have coined “Ed White’s Law”—“assess thyself or assessment will be done unto thee”—and of Les Perelman’s call to recognize assessment as a rhetorical act.4 Thus, we tread carefully here but are also eager to contribute to a greater understanding of how writing centers can have a positive impact on student success. We used mixed-method analysis of data, including student and faculty surveys, as well as observations of tutoring sessions and analysis of peer referrals to the RWS. We aimed to assess the overall function of the writing center from multiple perspectives. Here, however, we focus on the quantitative assessment of institutional outcomes associated with academic performance and persistence. Summaries of those components of assessment can be found in Appendix 1.
Sample Description and Methodology for Quantitative Institutional Assessment

Our study sample included all students who registered with the Reading and Writing Studios scheduling system (WCOnline) in the fall of 2015. As a quantitative measure of student outcomes affected by tutoring in the RWS, we chose students’ semester GPA, which we collected from transcript data. We collected data for 556 students who each attended between 0 and 28 tutoring sessions.5 Using regression models, we assessed whether attendance at sessions at the RWS had an impact on semester GPA. We employed two parameterizations of the key predictor, tutoring session attendance, and two ways of analyzing GPA as the outcome, resulting in a total of four analyses. As the two parameterizations, we employed (1) the raw number of sessions attended and (2) a discrete variable based on the aggregate number of sessions students attended: 0 sessions, 1 session, 2 sessions, 3 to 5 sessions, and 6 or more sessions. This had the effect of grouping frequency outliers (one student attended 28 sessions) into the “six or more” group. The models using the raw number of sessions naively assume that, on average, every additional session in the RWS has the same constant impact on GPA—that “more sessions” will have a continued and linear impact. One key problem with such an assumption is that students experiencing the greatest learning difficulties are often referred—if not required—to attend the Reading and Writing Studios as part of their learning accommodations and support. Using the second parameterization of attendance allows each session-frequency group to have its own effect on GPA, but at the cost of estimating more parameters. The second effect of grouping attendance is to collapse students who attended more sessions by choice and those who were required to attend more sessions into a single category of outliers who might otherwise have skewed a purely linear model of impact. 
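The discrete parameterization described above can be sketched as a simple binning function. This is an illustrative Python sketch under our own naming, not the authors' actual analysis code:

```python
def session_group(sessions: int) -> str:
    """Map a raw count of tutoring sessions to the discrete frequency
    groups used in the second parameterization: 0, 1, 2, 3-5, and 6+.
    The 6+ group folds in outliers such as the student who attended 28 sessions."""
    if sessions <= 0:
        return "0"
    if sessions == 1:
        return "1"
    if sessions == 2:
        return "2"
    if sessions <= 5:
        return "3-5"
    return "6+"

# The outlier who attended 28 sessions lands in the same group as a
# student who attended 6, so a single extreme value cannot dominate the fit.
for n in (0, 1, 2, 4, 6, 28):
    print(n, session_group(n))
```

Because each group receives its own regression coefficient, this coding trades a few extra parameters for freedom from the constant-per-session assumption.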
We analyzed two outcomes as “impacts”: (1) semester GPA (0.0-4.0) and (2) student “success” measured more broadly, in terms of pass/fail (GPA 2.0 and above versus GPA less than 2.0). We assessed the impact of the key predictor on semester GPA using two models, both employing multiple regression. The first model parameterized attendance at sessions using the raw number of sessions. The second model parameterized attendance at sessions using the session-frequency group variable. To control for the impact of known confounders, we included the following variables in the models: race, gender, age, college-readiness measures (Compass English and reading placement scores), and years since first enrollment at
Macomb (a proxy measure of amount of college experience). Research conducted at Macomb and elsewhere has shown that demographic variables such as race and gender, college-readiness measures, and amount of college experience are (sometimes strongly) associated with GPA. We assessed the impact of attendance at sessions on the pass/fail outcome using two models, both employing logistic regression, each using one of the two different parameterizations of the key predictor variable. A visual representation of the four analyses is in Table 1, below.
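As a minimal illustration of the two outcome codings and the naive linear model, the following Python sketch uses invented toy data (not the study's records) and omits the covariate controls for brevity:

```python
import numpy as np

# Toy data, invented for illustration: raw sessions attended and semester GPA.
sessions = np.array([0.0, 1.0, 2.0, 4.0, 7.0])
gpa = np.array([1.5, 2.0, 2.8, 3.3, 3.9])

# Outcome (2): binary "success" indicator, semester GPA of 2.0 and above.
success = (gpa >= 2.0).astype(int)

# Outcome (1): naive linear model, GPA ~ intercept + slope * sessions,
# fit by ordinary least squares; the slope plays the role of the assumed
# constant per-session impact on GPA.
X = np.column_stack([np.ones_like(sessions), sessions])
(intercept, per_session_effect), *_ = np.linalg.lstsq(X, gpa, rcond=None)
print(success, round(float(per_session_effect), 3))
```

In the article's full models the design matrix also carries the demographic and college-readiness covariates, and the binary outcome is fit with logistic rather than ordinary least-squares regression.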
Summary of Results

We evaluated the results of our analyses in terms of both significance and effect size, using both unadjusted and adjusted effect sizes; the adjusted estimates control for race, gender, age, college-readiness measures (Compass English and reading placement scores), and years since first enrollment at Macomb (a proxy measure of amount of college experience). As indicated in Table 2 (see Appendix 2), students who attended more tutoring sessions had higher semester GPAs on average. Using our naive linear model, which assessed the marginal linear impact of each session on GPA, we estimate that each session attended increases GPA by 0.043 grade points (p < 0.001), with a 95% confidence interval ranging from 0.020 to 0.066 grade points. Using the discrete parameterization of attendance and statistically adjusting for the controls listed above, differences in GPA ranged from +0.33 grade points for students attending one tutoring session to +0.79 grade points for students attending six or more sessions, compared with students who registered with the RWS but did not attend tutoring sessions. The distinction between these two ways of assessing the impact of attendance on semester GPA can be seen in Figure 2 (see Appendix 2). The straight line shows the estimated impact of tutoring sessions attended, after adjusting for our covariate controls (e.g., age, race, time since enrollment), and the box graph shows the impact by session-frequency group. As is evident in Table 3, the results of the analysis of impact on pass/fail outcomes were consistent with the results from modeling semester GPA directly. As with the effect-size estimates from modeling semester GPA directly, the adjusted effect-size estimates were monotonically ordered: students with a greater number of sessions tended to have higher odds of “success” (semester GPA ≥ 2) on average. 
The estimated increase in the odds of “success,” by comparison with the odds of “success” of students with no sessions, ranged from 88% for students attending one tutoring session to a nearly
seven-fold increase for students attending six or more sessions. In the logistic model that naively assumes the same constant effect for each session, the parameter estimate corresponded to an increase in the odds of “success” (a semester GPA ≥ 2) of 16.7% per session (OR = 1.167, p = 0.001), with a 95% confidence interval ranging from a 6.1% to a 28.3% increase in the odds of success, underscoring what we have previously described as the complications introduced by measuring educational impact using a purely linear model. These confidence intervals reflect the uncertainty in the estimates of the effect sizes. Note that for frequency groupings “1” and “2,” an odds ratio of 1 is contained in the confidence interval, indicating that we cannot exclude the possibility that 1 or 2 visits have no effect on the mean odds of “success.” For frequency groupings “3-5” and “6+,” the lower bound of the confidence interval is larger than 2.4, indicating that we can estimate with high confidence that the mean odds of “success” are increased at least 2.4-fold. The broad range of impact on students’ success represented here reflects the variable impact tutoring can have on students when reasonable delimitations are not set. As an example, students with one or more special learning plans who attended the RWS weekly or biweekly may have shown disproportionately greater or—more likely—smaller impact, given the specific learning challenges they face.
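The per-session odds ratio and its confidence interval follow from exponentiating the logistic-regression coefficient. In the sketch below, the coefficient and standard error are assumed values chosen only to approximate the reported figures (OR = 1.167, 95% CI roughly 1.06 to 1.28); they are not taken from the article's tables:

```python
import math

beta = 0.1544  # assumed per-session coefficient on the log-odds scale
se = 0.0485    # assumed standard error of that coefficient

# Exponentiating the log-odds coefficient gives the per-session odds ratio;
# the 95% CI comes from beta plus or minus 1.96 standard errors,
# computed on the log-odds scale and then exponentiated.
odds_ratio = math.exp(beta)
ci_low = math.exp(beta - 1.96 * se)
ci_high = math.exp(beta + 1.96 * se)
print(round(odds_ratio, 3), round(ci_low, 3), round(ci_high, 3))
```

Under these assumed values the lower bound exceeds 1, which is what makes a per-session effect statistically distinguishable from no effect.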
Discussion of Results

The results reported here provide local evidence for a measurable impact of a writing tutoring protocol that explicitly emphasizes genre analysis, rhetorical knowledge, and self-regulated learning and transfer. As in all work of this nature, factors like motivation and schedule availability are difficult to control. However, our estimated measures of effect size allow us to ask questions about how strong an influence might be and, with considered controls, they can help us narrow the possible rationales for influences in complex environments such as education. The effect sizes produced in our analysis provide strong evidence that greater use of RWS tutoring sessions leads to greater increases in student success. In sum, the number of attended RWS sessions matters: the greater the number of sessions, the greater the semester GPA, at least up to a reasonable point. By using session-frequency groups, we could remove some of the interpretive static introduced by a purely linear model of session attendance. Our analyses show a declining impact of attendance after many sessions, and common sense would lead us to expect such a
decline. A student who attends sessions two or more times a week is likely a student with documented special needs or, alternatively, a highly engaged, high-achieving student. In both cases, one would expect the marginal impact of each session after 10 or 20 sessions in one semester to be minimal. Using our assessment, we persuaded institutional administrators and the Board of Trustees that the RWS tutoring in place at this institution contributes to students’ overall semester-long success, measured in terms of semester GPA, even after we controlled for known confounding variables, such as race, placement testing scores, gender, and time in school. It is noteworthy that a significant, substantial association was found between exposure to RWS tutoring sessions and semester GPA, which is, after all, an aggregate measure averaged across individual courses, rather than the grades on specific texts produced or grades in writing-intensive courses. The mechanism that produces changes in GPA is still poorly understood. One possible interpretation is that reading and writing abilities have quite general and substantial applicability across courses, and that work on reading and writing in the RWS affected those outcomes. A second interpretation is that the RWS consultation protocol, which emphasizes genre analysis, metacognitive reflection, and self-regulated learning, helps students adapt and transfer their learning across their courses, which is the intention of the design. The data and design of the current research, however, are insufficient to further analyze the mechanisms that affected the GPA increase. In the future, we hope to garner support for a longitudinal study of students’ attitudes and expectations about reading and writing in their college courses and how those attitudes and expectations evolve through their encounters with the protocol. 
We hope such a study will teach us more about how students perceive and apply the strategies we teach in the RWS and will help us understand more specifically the mechanism by which students’ “success” was improved. For our pragmatic purposes, the findings were persuasive enough to secure permanent funding for the RWS and began an institutional conversation about genre analysis and writing in the disciplines that had previously remained underground, if present. This precipitated additional invitations for our writing center tutors to facilitate writing workshops in courses such as International Business, Occupational Therapy, and our Early College initiative. Additionally, some faculty members in our communications and writing department sought us out for examples of capable scientific writing and analysis to include in their first-year courses. Moreover, some of our high-ranking
institutional administrators became vocal advocates—not only for the use of the RWS, but also for the consultation protocols we had in place. In fact, at one institution-wide meeting, the Provost publicly articulated that the RWS mission and methods were neither to proofread nor to improve grammar directly, but to serve students through grounded learning methods. We recognize our assessment’s limitations. First, the control group of students—those who signed up for tutoring but did not attend—is very small. Because most of our registrations occurred during class visits, there is reason to believe that these students were similar in many respects to the other students in the study, and we have controlled for the factors we were able to identify and categorize. While the impact of exposure to RWS sessions may contain a bias due to uncontrolled confounding, we have controlled for several demographic and other variables known to be associated with academic performance in our statistical analysis. Hence, we believe we have made a substantial effort to clearly identify the impact of RWS sessions on overall success in courses, as measured by semester GPA. Motivation is one such potentially uncontrolled variable. Research conducted in a separate study at our college explored relationships between non-cognitive skills and semester GPA using scores from a pilot of the SuccessNavigator software. There are three variables in the SuccessNavigator (SN) non-cognitive assessment instrument that could be viewed as reasonable proxy measures of student motivation. That analysis showed that the impact of these proxy measures of motivation was captured by the demographic covariates included in our analysis. If the SN proxy is a valid measure of student motivation, then such findings suggest that our analysis indirectly models motivation through our demographic covariates, and thus student motivation might not be a substantial confounder. 
Ultimately, we will only be able to identify how the variable we so often call “student motivation” affects the impact of tutoring when we develop a deeper, more complex grounding of the behaviors we observe and associate with motivation.
Concluding Remarks

We do not offer this summary of experiences and findings as a way to resolve the current tensions experienced by writing center administrators in the era of accountability funding but rather to persuade other colleagues to find ways to capitalize on these tensions as opportunities. We call on our colleagues to explain to our institutional interlocutors what we do and why
we do it. And we encourage them to draw on scholarship from within and outside our disciplines to advocate for pedagogical principles when developing assessments. This kind of institutional and disciplinary translational work is essential to sustaining the validity and value of our disciplinary expertise. In this vein, we argue that accountability funding initiatives frequently contain objectives that many writing professionals also advance. For example, we can agree across our disciplines and institutions on a commitment to supporting historically underrepresented students in completing their educational goals. By working within our rhetorical commonplaces—student inclusion and support—we believe writing instruction professionals can further and more persuasively advocate for our intersecting professional knowledges and ideals within this contested era of education. We began this article with recognition of the fraught positions that writing center administrators occupy at the intersection of institutionally driven outcomes assessment, education-policy initiatives, and the student-centered, process-driven values professed by a diverse family of writing studies specialists. We have aimed to achieve two things through the description of our analysis and results. First, we have aimed to demonstrate a path by which writing centers can advocate for best practices while also committing themselves to internal and external assessment regarding institutional “student success.” Second, we have aimed to provide a model for enacting that assessment in a way that is multi-faceted and responsive to both institutional and individual student needs. We recognize that our model is but one response to the perceived gaps in assessment approaches. 
It is our hope that we have provided a useful and trustworthy model that can productively motivate new conversations between writing center initiatives and college-wide assessments at other institutions, while also helping writing centers argue for sound pedagogies. We look forward to the publication of this special issue of Praxis and to reading other articles in the special issue that can foster our continued growth toward an even more progressive and reflective model, explicitly for two-year college writing center work. As we close, we want to reiterate the calls for continued research on writing center assessment and for increased communication between writing studies and institutional administrators about the processes, rationales, and implications of institutional assessment at all levels of education, so that disciplinary scholarship and higher education policy can enjoy a more generative collaboration that—in the end—will best support students’ learning.
Notes

1. See, for example, Williams; Hourigan; Nystrand; Nystrand et al.
2. See, for example, Bawarshi; Rounsaville et al.; Reiff and Bawarshi.
3. See, for example, Babcock and Thonus; Hallman et al.
4. Discussions of “White’s Law” can be found in Bloom et al.; Elliot and Perelman; Haswell and Wyche-Smith.
5. The students in the “no tutoring sessions” group were students who registered with the RWS but did not attend any tutoring sessions.
6. https://form.jotform.com/70263719023148
7. The observed, unadjusted difference serves as the unadjusted effect-size estimate. The parameter estimates of the coefficients representing the exposure-to-RWS-services variables in the GLM models are the adjusted effect-size estimates.

Works Cited

Babcock, Rebecca Day, and Terese Thonus. Researching the Writing Center: Towards an Evidence-Based Practice. Peter Lang, 2012.
Bawarshi, Anis S. Genre and the Invention of the Writer: Reconsidering the Place of Invention in Composition. Utah State UP, 2003. Google Scholar, http://digitalcommons.usu.edu/usupress_pubs/141/.
Bloom, Lynn Z., et al. “Symposium: What Is College English?” College English, vol. 75, no. 4, Mar 2013, pp. 425–430.
Council of Writing Program Administrators, et al. Framework for Success in Postsecondary Writing. Jan 2011, http://wpacouncil.org/files/framework-for-success-postsecondary-writing.pdf.
Driscoll, Dana Lynn, and Sherry Wynn Perdue. “RAD Research as a Framework for Writing Center Inquiry: Survey and Interview Data on Writing Center Administrators’ Beliefs about Research and Research Practices.” The Writing Center Journal, vol. 34, no. 1, Fall/Winter 2015, pp. 105–133.
---. “Theory, Lore, and More: An Analysis of RAD Research in ‘The Writing Center Journal,’ 1980–2009.” The Writing Center Journal, vol. 32, no. 2, 2012, pp. 11–39.
Hallman, Rebecca, et al. “Re(Focusing) Qualitative Methods for Writing Center Research.” The Peer Review, vol. 1, no. 0, n.d.,
http://thepeerreview-iwca.org/issues/issue-0/refocusing-qualitative-methods-for-writing-center-research/.
Haswell, Richard, and Susan Wyche-Smith.
Praxis: A Writing Center Journal • Vol 15, No 1 (2017) • www.praxisuwc.com
“Adventuring into Writing Assessment.” College Composition and Communication, vol. 45, no. 2, May 1994, pp. 220–236.
Hourigan, Maureen M. Literacy as Social Exchange: Intersections of Class, Gender, and Culture. SUNY P, 1994. Google Scholar, https://books.google.com/books?hl=en&lr=&id=EE9mlAg7ygC&oi=fnd&pg=PR9&dq=%22why+johnny+can%27t+write%22+and+composition+studies&ots=ZoPAwVQ6P_&sig=FE6fYafIQi0EHUgDsqn3eZfzPN8.
Lerner, Neal. “Choosing Beans Wisely.” The Writing Lab Newsletter, vol. 26, no. 1, Sep 2001, pp. 1–5.
---. “Counting Beans and Making Beans Count.” The Writing Lab Newsletter, vol. 22, no. 1, Sep 1997, pp. 1–4.
National Council of Teachers of English. NCTE Position Statement on Reading. Position Statement, Feb 1999.
Elliot, Norbert, and Les Perelman, editors. Writing Assessment in the 21st Century: Essays in Honor of Edward M. White. Hampton P, 2012.
North, Stephen M. “The Idea of a Writing Center.” College English, vol. 46, no. 5, Sep 1984, pp. 433–446.
Nystrand, Martin. “The Social and Historical Context for Writing Research.” Handbook of Writing Research, edited by Charles A. MacArthur et al., Guilford P, 2006, pp. 11–27.
Nystrand, Martin, et al. “Where Did Composition Studies Come from? An Intellectual History.” Written Communication, vol. 10, no. 3, 1993, pp. 267–333.
Pennington, Jill, and Clint Gardner. “Position Statement on Two-Year College Writing Centers.” Teaching English in the Two-Year College, vol. 33, no. 3, 2006, pp. 260–263.
Reiff, Mary Jo, and Anis Bawarshi. “Tracing Discursive Resources: How Students Use Prior Genre Knowledge to Negotiate New Writing Contexts in First-Year Composition.” Written Communication, vol. 28, no. 3, 2011, pp. 312–337.
Rounsaville, Angela, et al. “From Incomes to Outcomes: FYW Students’ Prior Genre Knowledge, Meta-Cognition, and the Question of Transfer.” WPA: Writing Program Administration, vol. 32, no.
1, Fall 2008, pp. 97–112.
Toth, Christina M., et al. “‘Distinct and Significant’: Professional Identities of Two-Year College English Faculty.” College Composition and Communication, vol. 65, no. 1, Sep 2013, p. 90.
Williams, Bronwyn T. “Why Johnny Can Never, Ever Read: The Perpetual Literacy Crisis and Student Identity.” Journal of Adolescent & Adult Literacy, vol. 51, no. 2, Oct 2007, pp. 178–182.
Appendix 1: Summary of Qualitative Assessment Protocol Findings

Student Assessments: We used WCOnline’s survey feature to assess students’ experiences at the RWS, their self-described levels of confidence to perform the task at hand, and summaries of next steps in their own words, in order to gauge the use of the RWS and its impact on students’ self-efficacy. In short, students described confidence and self-efficacy (e.g., “I believe I can carry out the revision steps I’ve outlined”) at very high rates (95% and above), which suggests high intrinsic reward. Analysis of students’ “next steps” is ongoing, and we hope to address this methodology in later publications.

Faculty Assessments: We used the National Council of Teachers of English (NCTE) position statements on Writing Centers at Two-Year Colleges and on Reading to form the foundations of our assessment form (Pennington and Gardner; NCTE).6 We adapted additional questions about faculty perceptions of impact based on input from our Faculty Advisory Board. We selected the criteria outlined in these position statements because we wanted faculty first to understand the purpose and methods of writing center instruction and then to evaluate our operation against those standards. We also wanted faculty to engage our practices critically and to provide productive feedback about how we could better meet those outcomes. Thus, we aimed to eschew conversations about proofreading and editing by framing the conversation in terms of “industry standards,” a term our college uses and that faculty members support. In addition to conducting informational outreach with faculty and providing grounded outreach and support for new faculty who assign writing in their classes, we also exported our philosophies in the form of teaching support “playing cards,” which we handed out during professional development events, one-on-one conversations, and student-support-team collaborations.
(A PDF of our faculty outreach playing cards can be found here: http://bit.ly/2jrC3V5)

Overall, survey responses demonstrated that our outreach efforts have remained incomplete. Those who had been exposed to our outreach through faculty orientation or class visits, and those who had visited the studios, indicated that we met or exceeded the criteria adapted from the NCTE position statements. However, roughly half of our respondents answered “did not know” on several items and could not evaluate the procedures we have in place for training or for tutoring. Our efforts to improve outreach to faculty and to include their evaluation in our assessment are ongoing.
Appendix 2

Figure 1: Consultation Hierarchy
Figure 2: Comparison of Effect-Size Estimates for Linear and Session Frequency Groups
Praxis: A Writing Center Journal • Vol 15, No 1 (2017) www.praxisuwc.com!
!
Table 1: Analysis Parameters

Key Predictor Parameterization                              | Assessed Impact and Analytical Method
(1) Number of sessions attended                             | (1) GPA (multiple regression); (2) Pass/Fail (logistic regression)
(2) Session frequency grouping (0, 1, 2, 3-5, 6 or more)    | (1) GPA (multiple regression); (2) Pass/Fail (logistic regression)
Table 2: Means and Effect-Size Estimates for Semester GPA Outcome (see note 7)

Number of Sessions        | Mean Semester GPA | Unadjusted Difference (from no-sessions group) | Adjusted Difference (from no-sessions group) | 95% Confidence Interval | p-value
No sessions (N=34)        | 2.26              | --                                             | --                                           | --                      | --
1 session (N=226)         | 2.75              | +0.49                                          | +0.33                                        | -0.02, +0.67            | 0.061
2 sessions (N=98)         | 2.92              | +0.66                                          | +0.45                                        | +0.08, +0.82            | 0.019
3 to 5 sessions (N=125)   | 2.88              | +0.62                                          | +0.78                                        | +0.42, +1.14            | < 0.001
6 or more sessions (N=73) | 3.05              | +0.79                                          | +0.79                                        | +0.40, +1.18            | < 0.001
Table 3: Effect-Size Estimates for Pass/Fail Outcome

Number of Sessions        | Unadjusted OR (no-sessions group as reference) | Adjusted OR (no-sessions group as reference) | 95% Confidence Interval | Significance Level
No sessions (N=34)        | 1.00                                           | 1.00                                         | --                      | --
1 session (N=226)         | 2.40                                           | 1.88                                         | 0.80, 4.40              | 0.14
2 sessions (N=98)         | 2.94                                           | 2.06                                         | 0.78, 5.44              | 0.15
3 to 5 sessions (N=125)   | 3.27                                           | 6.44                                         | 2.43, 17.08             | < 0.001
6 or more sessions (N=73) | 6.21                                           | 7.57                                         | 2.45, 23.42             | < 0.001
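The odds ratios in Table 3 compare each session-frequency group's odds of passing to the no-sessions group's odds. As a minimal sketch of how an unadjusted odds ratio of this kind is computed — the pass/fail counts below are hypothetical, since the article reports only the resulting estimates, and the adjusted ORs additionally come from exponentiated logistic-regression coefficients, per note 7:

```python
# Sketch of an unadjusted odds ratio, as reported in Table 3.
# The counts used here are hypothetical; the article does not publish
# raw pass/fail counts per session-frequency group.

def odds(passed, failed):
    """Odds of passing: passed students divided by failed students."""
    return passed / failed

def unadjusted_or(group_pass, group_fail, ref_pass, ref_fail):
    """Odds ratio of a session-frequency group vs. the reference
    (no-sessions) group: ratio of the two groups' odds of passing."""
    return odds(group_pass, group_fail) / odds(ref_pass, ref_fail)

# Hypothetical example: 196 pass / 30 fail in a tutored group,
# 24 pass / 10 fail in the no-sessions group.
print(round(unadjusted_or(196, 30, 24, 10), 2))  # → 2.72
```

An OR of 1.00 (the reference row) means equal odds; values above 1 mean the group's odds of passing exceed the no-sessions group's. The adjusted ORs control for covariates in the GLM, which is why they can differ substantially from the unadjusted ratios.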
OUR STUDENTS CAN DO THAT: PEER WRITING TUTORS AT THE TWO-YEAR COLLEGE

Clint Gardner
Salt Lake Community College
clint.gardner@slcc.edu

Abstract

Because the author had heard from other writing center professionals at community colleges that community college students are not capable of serving as peer tutors, and because survey data demonstrate that community colleges do not hire peer tutors at the same rate as other institutions of higher learning, the author conducted exit interviews of peer tutors at Salt Lake Community College in order to determine what peer tutors learn from their work experiences in a community college writing center. The purpose of the study was to document not simply what peer tutors take away from their experience but also to substantiate that they can indeed help the writers they work with to learn. Since the results of this analysis were broad and represented a wide variety of concepts learned by peer tutors, the author designed a more specific survey to explore what tutors learned about writing and being a writer. The resulting data led the author to conclude that peer tutors learn much from their work experience, allaying concerns that community college students are not capable of serving as peer tutors.
Author’s note: This essay is based upon two keynote addresses: one at the South Central Writing Centers Association Conference in Corpus Christi in 2013, and the other at the Transitioning to College Writing Conference hosted by the University of Mississippi in 2015. I have also presented the results of interviews at various conferences, including CCCC, IWCA, and NCPTW.

Back in the 90s, when I was fairly new to my writing center directing career, I was presenting at our local Two-Year College Association Conference on how we at Salt Lake Community College (SLCC) had developed a peer tutoring program. It was a standard how-to-writing-center presentation, and I discussed all the pertinent issues: dealing with staffing turnover; recruiting tutors from varied backgrounds, including students who had struggled with writing in the past; preparing peer tutors to work with the wide variety of writers we see; and so on. You can imagine, then, how taken aback I was during the question and answer period when my whole presentation, and my premise for hiring peer tutors, was dismissed by a writing center colleague from another community college. “Our students could never do that,” she said. In my mind’s eye, I have the person storming out of my session, but in reality I think she just sat down while I hemmed and hawed for a response.
This colleague’s community college was no different from most other community colleges I’ve seen: more racially and ethnically diverse than other types of higher ed institutions, with students from a wide range of ages and socio-economic backgrounds, more veterans, more returning students, and more refugees (AACC 2). In other words, that community college, like most others, demographically reflected the community that it served. As George Vaughan from the Academy for Community College Leadership and Advancement, Innovation, and Modeling stated, “If one wants to understand who attends a community college... stand on a busy street corner, and watch people go by” (19). While I have no idea what my reply to this particular colleague was—I don’t think I swore excessively—the person’s indignation at my hiring peer tutors is burned into my memory and has been a prime motivator for me to study the efficacy of peer tutors at community colleges, what impact they have on the institution as a whole, and, most importantly, what impact their work has on their education and lives. This demonstration of negativity towards peer tutoring at community colleges is not unique; I have encountered it, both more and less blatantly, in discussions with colleagues over the years. The bias is borne out by the data from the Writing Centers Research Project (WCRP) 2001-2008, which indicate that two-year colleges employ peer tutors in their centers at a lower rate than other types of higher education institutions: 126 of 260 responses (47%) for two-year colleges, versus 260 of 306 (85%) at four-year institutions (“Raw Survey Data from Previous Years”). The current WCRP data from 2014-2015 indicate moderate growth in the number of peer tutors at two-year colleges; the percentage is now 58% (n=38), but we still lag behind four-year institutions, which are now at 95% (n=64).
The lower number of responses in the current WCRP must be taken into consideration when evaluating whether or not there is an upward trend in peer tutoring at two-year colleges. Since I have not yet surveyed two-year college writing center administrators, I can only make informed guesses about why two-year colleges don’t avail themselves of peer tutors: the assumption seems to be that peer tutors are ill-prepared and cannot talk about writing, and that having them do so would be having the ignorant teach the ignorant. In other words, the “our students could never do that” attitude flies directly in the face of the early peer tutoring theory of Kenneth Bruffee, who championed peer response as an effective teaching model and addressed the issue of the “blind leading the blind” by stating that

one answer to this question is that while neither peer tutors nor their tutees may alone be masters of the normal discourse of a given knowledge community, by working together—pooling their resources—they are very likely to be able to master it if their conversation is structured indirectly by the task or problem that a member of that community (the teacher) provides. (9-10)

I am not going to address the bigotry behind such negative beliefs, as that seems counterproductive; I would rather focus on positive outcomes. I also won’t be focusing on whether or not peer tutors can give effective feedback. That topic is definitely worthy of further study, but I am instead going to explore the impact that writing center work has on peer tutors: how they learn about writing; how they, in turn, can pass that knowledge along to the students they work with; and what they learn from their fellow students. I will explore how peer tutors demonstrate that they can do this work, and that it has an important impact on their abilities as writers and their lives as human beings. Being a peer tutor is a reciprocal educational experience in and of itself. There is a growing number of studies that explore what peer tutors learn about writing—a specific claim that Bruffee made when advocating peer tutoring in writing (“‘Conversation of Mankind’”).
One study that explores Bruffee’s assertion is the Peer Writing Tutor Alumni Research Project (PWTARP). That project was developed by Brad Hughes, Paula Gillespie, and Harvey Kail in order to better understand the effects that peer tutoring work in a writing center environment has on students. Hughes, Gillespie, and Kail have found significant academic and career-trajectory outcomes for peer tutors who have worked in their institutions, as they report in their 2010 Writing Center Journal article, “What They Take with Them: Findings from the Peer Writing Tutor Alumni Research Project.” The project asks alumni peer tutors to respond to a written survey, and it focuses on impact on learning (particularly learning about writing) as well as on career and educational paths.
For several years now, I have been tracking the careers of peer tutors (formally, Peer Writing Advisors) who worked at the Salt Lake Community College Student Writing Center, loosely based upon the guidelines put forth by the PWTARP. My spin on the project was to record exit interviews to probe the depths of the impact the writing center had on their education and their lives. (See the Appendix for the complete set of interview questions.) I have recorded fifteen exit interviews since 2007 and have been the sole interviewer, transcriber, and researcher. All tutors who have stopped working at the Student Writing Center are asked to be interviewed, but seven were unavailable. When I initially started conducting the interviews, Salt Lake Community College did not have an institutional review board (IRB). SLCC has since instituted an IRB, and I have obtained IRB approval for further research. Overall, I believe the recordings create a better connection with the interview subjects than written responses would, and they are certainly more evocative than text. I have used them, in fact, to show administrators the importance of peer tutoring to our institution. My exit interviews echo the findings of foundational studies that “peer tutors help themselves increase their own understanding of the subject matter they tutor students in/on, which boosts confidence and can carry over to their desire to learn other subjects” (Ehly et al. 21). While I have no evidence that these tutors “desire to learn other subjects,” it is clear that they have expanded their ideas about people and the world at large. Likewise, the PWTARP supports these conclusions, based upon formalized surveys of what alumni peer tutors say they have learned in their work in writing centers.
PWTARP identifies the following topics that respondents state they developed: a new relationship with writing; analytical power; a listening presence; skills, values, and abilities vital in their professions; skills, values, and abilities vital in families and in relationships; earned confidence in themselves; and a deeper understanding of and commitment to collaborative learning (14). The exit interviews I conducted represent all of these developments except application in a profession, since the tutors had not yet begun their professions at the time of their exit interviews. Since I teach writing, I was initially more interested in what peer tutors learned about writing and rhetoric than in other outcomes. Over the course of conducting the interviews, however, my perspective changed. In accordance with the IRB guidelines in place at Salt Lake Community College, I have changed the
names of the five peer tutors to capital letters (T, C, K, N, and F) in order to protect their privacy. CG indicates me. One of my first interviews, from 2007, was with T, a female tutor in her early twenties who worked in the Student Writing Center for approximately two years, where she conducted 462 sessions with approximately 354 different students:

CG: What are the most significant abilities, values, or skills you developed as a Peer Writing Advisor?

T: Empathy, patience, and the ability to break down my own language for others, and the ability to pick apart my own writing because of what other students have written—seeing patterns that I use in their writing, and being able to look at it subjectively—objectively? [Eyeroll upwards as if questioning which word to use.]

CG: So do you think, in some sense, that it has helped you improve as a writer working with other folks even though they may be more struggling writers, let’s say?

T: Yes, because I started at that base. That’s one of the things I think I was—one of the strongest things I was hired for is because I wasn’t a natural at this, I had to work at it. I know where they’re at. I know how that feels.

CG: You can connect. You can connect.

T: Yup.

In retrospect, I see that T’s statement is thorough, but it wasn’t the answer I wanted to hear. I then re-asked the question, as if learning empathy wasn’t an important thing to learn or related to writing.

CG: Um. What do you think you learned most—aside from empathy—which is a good thing to learn! [laughs]—is there anything specific you learned in responding to people and their writing or teaching or something like that?

T: Different learning styles. You need to be able to cater to each one; if you go on a different track either they’re not going to learn or they are going to use you as a crutch as they’ve used other people.
Using their learning style to their ability gives them a chance to take responsibility of their own academic progress. Nevertheless, despite what I thought at the time I interviewed T, she is indeed talking about writing and what she learned about writing. As she said, she learned how to read her own writing through the lens of those writers she worked with, and respond accordingly. Because, as she claims, she wasn’t a
“natural” at writing, she had to pay better attention to her own work and to the work of others. T is talking about a complex set of activities and language use: she learned to apply analytical principles to her writing or, in her words, to “break down my own language” and “pick apart my own writing” through “seeing patterns that I use” in the writing of others. When I interviewed T, my view of tutors’ work and what they learned was parochial at best, and completely focused on a very narrow view of what writing is and how people learned to write. Peer tutoring does, indeed, give something more to the tutor than just learning about writing. Peer tutors do learn empathy. Peer tutors learn about the mechanics of learning and how to accommodate varied learning styles. Mostly, of course, they learn about working with other people—people they may have never even considered working with before. As Brian Fallon stated in his 2011 National Conference on Peer Tutoring in Writing keynote address, “Peer tutors... teach all of us how to meet our students where they are, how to celebrate in that space, and how to be open to learning from moments that present great challenges” (362). Furthermore, in his 2010 dissertation The Perceived, Conceived, and Lived Experiences of 21st Century Peer Writing Tutors, Fallon challenges the field to go back to those original conceptions of peer tutoring, to rethink Bruffee, Harris, Trimbur, Hawkins, and Kail, and to think about their early work in terms of the lived experiences of present-day tutors. Writing center scholars have done their work when it comes to the perceived and conceived experiences of tutors, but it is time to fold a new voice into the debate by including peer tutors more substantially in our professional communities of practice. By seeing our field through the eyes of peer tutors, we stand a better chance of understanding the future contributions of peer tutoring to teaching and learning. 
(235–236) For Fallon, the perceived and conceived experiences of peer tutors are what we directors (or theorists) place upon them, rather than the tutors’ lived experiences, which we can only discover by talking with them and not filtering that conversation through our own perceptions and conceptions (Perceived 205-217). My narrow conception of peer tutoring as “only about writing” or “learning to write” was far too reductive; as Fallon describes, I found that I needed to be more open to other types of learning taking place—learning based on tutors’ lived experiences. Learning about
other concepts is represented in many of my interviews, such as those with C and K. C, a male in his mid-twenties, worked in the Student Writing Center for approximately three years, where he conducted 1,403 tutoring sessions with 955 students. He confirms that working with writers helps one put one’s own writing and learning in perspective:

C: I think one of the most valuable things was to see writing as a process. I think it was particularly when I started working with students—working with them on their papers and trying to help them see how they could improve their papers, I started to realize that I had to give myself the patience with my own writing, and give myself time to do it and time to write multiple drafts, and see it as a process instead of a product model.

K, a male in his early twenties, worked in the Student Writing Center for approximately eighteen months, where he conducted 260 sessions with 195 different writers:

K: Well, I think by helping other people write and learn to be better writers, in the process I definitely gained writing skills, and so reaped the benefits of that. You know everything from planning and outlining, I’ve realized that by helping other people plan and outline their papers how crucial it is in writing a paper...

CG: Anything else?

K: Well, research, I’ve learned to evaluate the credibility of sources. So that’s really important.

Other interviewees highlight the various skills they picked up, much like T, who identified empathy as a key lesson from her time as a tutor. N, a mid-twenties male, worked in the Student Writing Center for approximately two and a half years, where he conducted 724 sessions with 507 writers:

N: I think people skills, obviously. You know when I would first meet with people I would be really nervous—I would shake or sometimes I’d stutter. I remember one of the first things [a colleague] said: “I don’t know if I can do this.” And I was thinking the same thing!
But we both decided to—uh—just—we trusted you and we trusted your confidence in us. Like T, N was a student who had struggled during his high school education because of learning disabilities, and he admitted that he lacked confidence in his own abilities as a student. His ability to overcome his own apprehension is certainly something that one can note as successful. In his tutoring evaluations, students regularly commented on how N regarded them with
respect and evinced concern for their learning and performance as a student. N, like T before him, was able to take his apprehensions and learn not just to overcome them, but to use them as a way to connect with students. Finally, F, a male in his early twenties, worked in the Student Writing Center for one year and conducted 219 tutoring sessions with 173 different writers:

F: I think just being able to have the opportunity to talk with people from so many different cultures and so many different languages has... helped me become globalized. You know what I mean? It just helps me see everybody more as like one big community. And just helping to see people as people in need, or somebody’s individual strengths, rather than any sort of racial barriers. It has really helped to break any notions of that down for me, and I really value that a lot.

F identifies a development of increased understanding of people from different backgrounds, as well as hinting at the idea that he learned tolerance through working as a writing tutor. F admitted to me at one point that his upbringing was what he called “sheltered,” and that he was home-schooled. I do not know the extent of that sheltering, but F seems to believe that his exposure to others while tutoring brought him increased, sustained contact with people from ethnic and racial backgrounds different from his own. Even though the interviews I have conducted show that peer tutoring has a far wider impact on tutors than just what they learn about writing, I was still compelled to determine whether the basic claims made about peer tutoring (that it helps to improve the tutor’s writing) could be measured, and how such learning about writing is demonstrated.
Exit interviews were slowing the pace of the study because of low tutor turnover rate, so I decided to speed things up by conducting a focused survey of the current peer tutoring staff at the Salt Lake Community College Student Writing Center (see Appendix) with a total of seventeen responses. In the survey, I ask “What have you learned about writing from working with the writing of others?” One respondent explains that seeing the mistakes that other writers made gave her a stronger sense of basic writing principles: “I have learned how important each part of the writing process is and how easily any part can be missed if the writing project is rushed. The importance of a clear thesis statement, good topic sentences, logical organization of evidence, and effective conclusion has been
reinforced for me. Mostly I have learned 2 things: 1) focused and sufficient research is essential for creating an interesting and convincing paper, and 2) research and writing must take place within the context of questions and critical thinking.”

Another tutor learned about style: “My knowledge of mechanics has improved, as has my awareness of options in writing style. I also feel it has improved my ability to self-edit my work.”

And another learned about grammar: “Mostly techniques but it has also inspired me to learn more about grammar because even though I know how to write ‘well’ because I know when to write or not write certain things somewhere along the way it became more automatic and less cause and effect. So I would have sessions where students would ask me why we do certain things grammatically and I’d have to look it up because I forgot to ask why and so now I've become more curious about the whys in writing.”

As with T in the exit interviews, the context of writers working on writing—even struggling writers—helped these three respondents to see how writing can be crafted and improved. Tutors apply their current knowledge and reinforce it when they work with peers to find answers for themselves and make choices as writers. They seek out new knowledge from reliable sources when they don’t know the answers. Working as tutors also helped them to apply and explore notions of writing that they would otherwise only see within the context of their own work. Thus, a peer tutor demonstrates learning when she helps a writer to apply different styles or different grammatical structures, as well as when she works with a writer to make decisions about problems they encounter in their writing (Devet 125-126). Furthermore, learning respect for writers is demonstrated in how one respondent summed up tutoring writers: “I find that I am constantly learning about the art of writing.
Students will bring work into the writing center that I am intrigued by. They are using their writing in ways that I had never thought of or never thought were worth trying. By helping others, I am becoming more experimental and willing to try things out.” By experiencing another writer’s choices, this tutor developed a better sense of the choices he or she can make as a writer. The tutor shows an awareness that a student, no matter their perceived abilities, makes
choices as a writer and deserves respect as such. This particular respondent has learned what Wardle and Hughes call a great advantage:

Tutors view their conferences not in terms of the idiosyncratic ‘deficits’ of individual writers (or particular demographics of writers) but in terms of processes of learning that challenge many individuals at many different stages of their academic careers. (178)

Tiffany Rousculp, my colleague at SLCC, emphasizes the need to respect writers and what they bring to a writing center:

[Community Writing Center (CWC)] staff tried to remain fully aware of the complexities that people brought with them into relationship with the CWC—ever unfolding webs of resources, needs, and desires. The people whom... the community college... wanted to “empower” were not deficient beings requiring our educational benevolence; as such, it was not the Community Writing Center’s role to lead people to “change”; rather, we need to respect them for who, what, and where they were at a particular moment. This realization steadily altered the way the CWC would relate to the community—from seeing ourselves as a source of salvific change toward what Ellen Cushman calls “deroutinization.” (54)

As Rousculp notes, Cushman references sociologist Anthony Giddens’ definition of “routinization” as the social constructs and structures that shape our behaviors and interactions. “Deroutinization” gives us pause and allows us to move toward social change by disrupting the routine (Cushman 12-13). Rousculp found that tutors who worked in the CWC gained new perspectives on the writers they worked with. I am convinced that the same “deroutinization” of cultural perceptions of “disadvantaged” or “underprepared” students happens for all writing tutors who have learned to respect the people they work with. Finally, one respondent put it this way: “The writing process is messy. What works for one person doesn't always work for others.
Also, it can be challenging for writers to recognize their own mistakes. Sometimes writers need others to explicitly point out what is not working or needs to be changed. Not to mention, writing is very personal and it can be hard to ask others for help, but it can also be liberating.” These anonymous written responses echo the findings of my recorded exit interviews when I asked
Praxis: A Writing Center Journal • Vol 15, No 1 (2017) www.praxisuwc.com
participants, “What are the most significant abilities, values, or skills that you developed in your work as a peer writing advisor?” Rather than reading that question as being only about writing and rhetoric, the respondents read it broadly. They understand that there are significant abilities, values, and skills involved not just in writing but in being the human being who is doing the writing. They don’t see their fellow students as helpless in their own education, or as victims of society, or, even worse, as culprits in their own failure. Peer tutors learn to respect the people they are working with as writers, learners, and human beings. Moreover, they learn that writing is a social act, and that it is important for writers to share their work with others. They learn about the writing process, as old-fashioned a term as that may be these days. They learn about revision and ways to make that stage of their process more effective. They learn about genre, style, usage, and grammar. They also learn to think about their own writing in novel and productive ways. Working as a peer tutor in a community college gives the student a chance to take on this difficult yet invigorating work. They learn from the students they are working with—in the best traditions of peer tutoring. Having peer tutors work in a community college writing center is well worth any presumed risk. They can apply the considerable amount that they already know, and they can continue to learn while doing it. Thus, they become stronger writers and stronger responders to writing. Becoming better responders ultimately improves their overall ability to communicate. There is little risk and a lot of reward for all participants in peer tutoring, but it is particularly rewarding for peer tutors. 
The PWTARP results emphasize the impact that working in a writing center as a peer tutor can have:

When undergraduate writing tutors and fellows participate in challenging and sustained staff education, and when they interact closely with other student writers and with other peer tutors through our writing centers and writing fellows programs, they develop in profound ways both intellectually and academically. This developmental experience, played out in their tutor education and in their work as peer tutors and fellows, helps to shape and sometimes transform them personally, educationally, and professionally. (Hughes et al. 13)

In emphasizing that we should pay attention to the lived experiences of peer tutors, Fallon extends the work of the PWTARP: “The journeys that peer tutors must take to become effective doers are fascinating because
they entail more than what writing center scholarship may imagine” (Perceived 187). Further, “What can be learned from PWTARP . . . is that peer tutoring fostered a kind of liberal education that penetrated the relationships these individuals had with everyone from co-workers to family members” (Fallon, Perceived 222). When we at two-year schools take on pessimistic attitudes that resemble “our students could never do that,” or assume that tutors won’t be in the center long enough for it to matter, we are accepting the trite and misinformed perception of community college students as failures instead of as real human beings with real potential. We are not offering them the respect they deserve for taking on the challenge of education. We are also, I fear, not respecting ourselves and the work that we do. We are falling into the trap of believing that students who attend community colleges are victims of themselves or of society, cannot take action that will effect change in their lives and their communities, and cannot decide for themselves whether such changes are needed. Our students can do this work. Our students do perform this work. Our students take more from it than we realize.

Works Cited

American Association of Community Colleges (AACC). 2016 Fact Sheet. February 2016, www.aacc.nche.edu/AboutCC/Documents/AACCFactSheetsR2.pdf.

Bruffee, Kenneth. “Peer Tutoring and the ‘Conversation of Mankind.’” Writing Centers: Theory and Administration, edited by Gary A. Olson, NCTE Press, 1984, pp. 3-15.

Cushman, Ellen. “The Rhetorician as an Agent of Social Change.” College Composition and Communication, vol. 47, no. 1, February 1996, pp. 7-28.

Devet, Bonnie. “The Writing Center and Transfer of Learning: A Primer for Directors.” The Writing Center Journal, vol. 35, no. 1, Fall/Winter 2015, pp. 119-151.

Ehly, Stewart W., and Stephen C. Larsen. Peer Tutoring for Individualized Instruction. Allyn and Bacon, 1980.

Fallon, Brian J. The Perceived, Conceived, and Lived Experiences of 21st Century Peer Writing Tutors. Dissertation, Indiana University of Pennsylvania, 2010.

---. “Why My Best Teachers Are Peer Tutors” (2011 NCPTW Keynote Address). The Oxford Guide for Writing Tutors: Practice and Research, edited by Lauren Fitzgerald and Melissa Ianetta, Oxford UP, 2016, pp. 356-364.

Hughes, Bradley, Paula Gillespie, and Harvey Kail. “What They Take with Them: Findings from the Peer Writing Tutor Alumni Research Project.” The Writing Center Journal, vol. 30, no. 2, Fall/Winter 2010, pp. 12-46.

“Raw Data from 2014-2015 Survey (Writing Centers Research Project).” Purdue OWL Research, owl.english.purdue.edu/research/survey.

“Raw Survey Data from Previous Years (Writing Centers Research Project).” Purdue OWL Research, owl.english.purdue.edu/research/survey.

Rousculp, Tiffany. Rhetoric of Respect: Recognizing Change at a Community Writing Center. NCTE Press, 2014.

“The Peer Writing Tutor Alumni Research Project.” The Peer Writing Tutor Alumni Research Project, writing.wisc.edu/pwtarp/.

Vaughan, George B. The Community College Story: A Tale of American Innovation. The American Association of Community Colleges, 1995.
Appendix

SLCC STUDENT WRITING CENTER EXIT INTERVIEW*

1) For archival purposes, please state your name, which locations you worked at, and how long you have been working for the Student Writing Center.
2) What additional education have you pursued or will you pursue after leaving SLCC?
3) What are your ultimate career goals?
4) What are the most significant abilities, values, or skills that you developed in your work as a peer writing advisor?
5) Describe your most positive experience from your work in the Writing Center.
6) How has your writing center training and experience shaped your development as a college student?
7) Anything you’d like to add?

*Based on The Peer Tutor Alumni Project Survey (http://www.marquette.edu/writingcenter/PeerTutorAlumniPage.htm)

“Writing and the Writing Consultant Survey”

1) To what extent do you think your own writing has been influenced by your experience as a writing consultant?
2) In reference to your answer to the first question, please explain how your writing has been influenced by working as a writing consultant.
3) What have you learned about writing from working with the writing of others?
4) How important do you think getting feedback on your writing is?
5) When you write something, how often do you get feedback from others?
6) How long have you worked as a writing consultant?
FOCUSING ON THE BLIND SPOTS: RAD-BASED ASSESSMENT OF STUDENTS’ PERCEPTIONS OF A COMMUNITY COLLEGE WRITING CENTER

Genie Giaimo
The Ohio State University
giaimo.13@osu.edu

Abstract

This longitudinal mixed-methods study assesses students’ perceptions of the writing center at a large (approximately 11,325 students) multi-campus two-year college. The survey was collaboratively designed, with faculty and student participation; it presents findings from 865 student respondents, collected by peer tutors-in-training. The study offers a baseline assessment (Fall 2014) of the writing center, prior to wide-sweeping changes in recruitment, staffing, and training models, as well as a post-assessment (Fall 2015) analysis of the changes in student knowledge of the WC and its purpose. It also offers data on the trajectory of student development in relation to the number of sessions attended. In 2014, students’ experiences at the writing center were inconsistent; the poorly articulated mission of the WC adversely affected students’ knowledge scores, and the center’s reliance on editorial-like feedback, given predominantly by adjunct faculty, contributed to inconsistent reportage of perceived learning by attended sessions. Many of these trends, however, reversed in 2015. This paper seeks to demonstrate the important role that RAD research can play in evaluating student learning within writing center contexts and in articulating how, at what moments, and under what conditions learning and development occur in the student-writing center relationship. It also offers a replicable experimental method that researchers at other institutions can adapt and apply to their own institutional contexts and programmatic needs.
Introduction There is incredible potential for community college writing centers to produce pertinent scholarship and to build research profiles based on replicable, aggregable, and data-supported (RAD) studies. As Richard Haswell defines it, “RAD scholarship is a best effort inquiry into the actualities of a situation, inquiry that is explicitly enough systematized in sampling, execution, and analysis to be replicated; exactly enough circumscribed to be extended; and factually enough supported to be verified . . . With RAD methodology, data do not just lie there; they are potentialized” (201). By identifying clearly characterized challenges and potential solutions, RAD studies can offer mechanistically specific data for positively influencing the development of writing centers at community colleges. The methodology is both universal and local; combining experimental rigor with questions that suit specific institutions and circumstances allows community college (CC) writing center administrators “to justify their programs and budgets to educational administrators and faculty across the disciplines who expect research-supported evidence” while also
supporting a pertinent “best practices” approach to assessment (Driscoll and Perdue 35). As Clint Gardner and Tiffany Rousculp suggest, community college writing centers “have specific challenges that a director must address . . . [that] relate to their two-year only status, open enrollment, and the mission to reach out into the community” (136). Additionally, peer tutors, who are far more often absent from community college writing centers than from four-year college writing centers, are, when present, a vital resource both for acting as ambassadors to the student population and for enacting a RAD research program. This study recruited peer tutors-in-training to co-develop a survey assessing student perceptions of the writing center, utilizing RAD methodologies. Our survey aimed to answer these questions: What do students know about the writing center? What makes them likely to be attracted to its services? Are students developing as they attend sessions, and, if so, at what point in the visitation cycle? The project was conceived in response to the findings and frustrations of students in the tutoring practicum and their desire to do better outreach and increase student use of the writing center’s services. Informal questioning and observation gave way to a more methodical and replicable experimental method: a survey with Likert-scale, dichotomous (true/false, yes/no), scaled, and open-ended qualitative questions (see Appendix for the survey). The survey utilized a mixed-methods approach that combined quantitative and qualitative questions. Qualitative responses help to place a narrative structure onto an otherwise numerically driven research project and offer justification for wide-sweeping changes that an audience will, perhaps, innately understand and sympathize with more than a numbers-driven model of assessment and program implementation. 
Additionally, a mixed-methods approach can be coded and analyzed within a RAD framework; as Haswell suggests, “Numbers may assist but do not define RAD scholarship. That is why the definition avoids the term empirical, which has so often been used to set up false oppositions with terms ethnographic, qualitative, grounded, and naturalistic” (201). Even when adding qualitative questions to a survey, however, I want to stress that quantitative data tends to attract upper administration and external funding committees, as narrative-based research is often too lengthy for full review (Lerner 106). It is also extremely challenging to collect enough qualitative data to provide a representative yet generalizable set of conclusions about a large and diverse population.

Our research study aimed to assess perceptions of the Bristol Community College (BCC) Writing Center within the general student population. The study occurred over two years, during which time large-scale policy and staffing changes were made. While the study’s development was influenced by the local context of the institution (open access) and the manner in which peer tutors were trained (via a required course), many of its features can be replicated at other research sites. The following features are replicable:

1. This study assessed the perceptions of students who had and who had not previously attended the writing center.
2. This study recruited student researchers to disseminate the survey and code the survey results.
3. This study was conducted using pen-and-paper (rather than electronic) surveys.
4. This study utilized multiple research sites, both within and outside of classroom spaces, and across the disciplines.

Many of the study’s dissemination features were crafted with community colleges’ unique and heterogeneous populations in mind. Aware that CC students have distinctive study, work, and living habits, as well as varying financial circumstances, the student researchers and I realized that many conventional surveying methods that four-year college writing centers tend to use, such as exit surveys and electronic surveys, would not accurately sample our student population or their knowledge of the BCC WC. Traditionally low WC attendance—by percentage of the overall student population—initially influenced our decision to recruit and survey students who had never attended alongside those who had. 
Lack of technology access—because of skill, age, or financial situation—informed our choice of pen-and-paper dissemination. Low student involvement with on-campus activities and limited time spent on campus (“car to classroom” habits) influenced our decision to survey both within and outside of classroom settings. Thus, jettisoning our assumptions about the habits and motivations of the student populations that do and do not attend two-year college writing centers allowed us to capture rare data on an under-studied population: their writing confidence, their writing knowledge, and their knowledge of our writing center’s mission and support.
A main impetus for this project was the lack of research in our field on the habits, motivations, and perceptions of students who do not attend writing centers. Widely surveying a student population ought to be the “gold standard” of the field’s methodological approach to studying habits and perceptions within writing center contexts. Students who do not attend a writing center can act as a control group for researchers to compare with students who do attend. Another reason to include this population in writing center assessment is that our field is predicated on the claim that writing centers “help all writers at all levels”; in reality, colleges and universities are lucky if they see 30% of a student population. At BCC, the number was even lower, around 25%. If we fail to study non-attendees alongside attendees, we risk losing a valuable demographic in our assessment. Of course, administrators also often argue that the number of students who come through a service’s doors is a measure of programmatic success; thus, studying the habits and perceptions of those who do not attend a writing center might contribute to an increase in usage rates. To complicate this assertion, however, the findings from this study demonstrate that single-users and repeat-users of the BCC WC might have very different motivations for attending and very different thresholds for improvement. The number of unique clients (proportionate to overall enrollment) might not be the best way to measure a service’s impact on student learning. In fact, measuring clients’ long-term engagement with writing center support yields far more robust data on student learning and confidence development and might be another key hallmark of effective writing center support. Without studying those who do not attend the writing center, it is difficult to know what effect, if any, it has on attendees.
Method

Class description. English 262: “Tutoring in a Writing Center Practicum” met once a week for two hours and forty minutes over a fifteen-week semester in fall 2014 and fall 2015. In the first year, there were fourteen enrolled students; in the second year, there were eight enrolled students. The course had a mixture of honors and non-honors students, although it was listed as a course within the Commonwealth Honors Program. Initially, the course required the completion of common ethnographic activities, including participating in a session at the writing center, observing a tutoring session, conducting a site evaluation, and tutoring a client while under observation. In writing centers that
run smoothly, such activities should not prove challenging for students to conduct. Unfortunately, during the fall 2014 semester, students struggled to complete these tasks in a timely manner. Although they did not realize it, some of the issues stemmed from previous mismanagement and a lack of training, which affected students’ ability to complete their ethnographic research. Clients did not show up for their appointments and, when they did, the sessions ended well before the 45-minute allotment, either because the tutors were directive and provided line edits, or because clients who were required to attend were resistant to engaging in the session. So, my students could not conduct complete observations and tutorials!

Developing the survey. From observing these systemic issues, my students concluded that if they struggled to utilize the writing centers’ services with anything akin to ease, other students were probably experiencing similar difficulties. They started asking me data-specific questions about the BCC Writing Center: Do students know that we have a writing center (or four, actually)? How many students attend the WC each semester? How many attend multiple times in a semester? Do students feel that the writing center helps them develop as writers? Do students know what a writing center does? These questions, and dozens more, guided the development of our survey. Together, we brainstormed a number of questions that we felt would improve the brand, marketing, and quality of the BCC WC’s services. These questions developed into a survey that we then shared with our Institutional Review Board (IRB). In class, we discussed the history of human subjects testing (the Stanford prison experiment, the Milgram study, the Tuskegee study) and how to conduct ethical and rigorous research. 
We workshopped our questions to alleviate selection bias and potential confounding variables, and we met with the IRB to learn how to conduct survey work methodically and sample respondents randomly. We also discussed the importance of statistical significance and the need to replicate, as closely as possible, the way in which we approached each potential survey participant, utilizing tools such as a script and an anonymous survey-coding system. The resulting survey data informed the BCC WC’s marketing, training, and assessment plans; it also helped me, as director, to demonstrate effective best practices within a community college writing center, when most “best practices” are based on four-year college writing centers. Our coding system was reproduced on the back of the survey and included the surveyors’ initials, the date and time, the campus, and the specific location/class. We strategized first to hand out surveys in public spaces, such as the library, the cafeteria, the parking lot,
the study and computer lounges, and the corridors of academic buildings. After our initial survey dissemination in these public spaces, we identified academic disciplines (including humanities, STEM, social sciences, business/management, and art) and randomly selected classes to survey in each discipline. As students learned to survey other students, we discussed the importance of randomly selecting candidates, both in class and in public spaces. Because successful students tend to follow similar patterns of behavior (another finding that our survey revealed), such as spending more time on campus outside of class, it was imperative that students did not simply ask their friends to fill out the survey. In class, we discussed the importance of respectfully approaching strangers and the need to sample a diverse group of students; that is to say, not to assume people were students or instructors based on their age, as 53% of BCC’s student population was over 21, and the average age of an enrolled student was approximately 29 (“Bristol Community College Fact Sheet”). In all, we received 499 unique responses for 2014 and 366 unique responses for 2015, totaling 865 responses.

Preparation of surveys. Our project was conducted anonymously, and responses were automatically coded numerically when transferred from paper to electronic storage. The data was entered by multiple student researchers and initially stored in Google Docs, a free resource for aggregating data electronically after conducting large-scale paper survey work, although some institutions have preferences for how data is housed electronically. Qualtrics, REDCap, and Box are other programs in which data may be securely collected and/or stored. All paper surveys were stored in a locked cabinet to which only I had access. 
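A coding system like the one described above can be generated programmatically. The sketch below is purely illustrative (the study printed codes on paper surveys by hand; the code format and field order here are hypothetical), combining surveyor initials, timestamp, campus, and location into one anonymous tracking code:

```python
from datetime import datetime

def survey_code(initials, campus, location, when=None):
    """Build an anonymous tracking code from surveyor initials,
    date/time, campus, and location/class (hypothetical format)."""
    when = when or datetime.now()
    return f"{initials.upper()}-{when:%Y%m%d-%H%M}-{campus}-{location}"

# e.g., surveyor "GG", a campus abbreviated "FR", surveying in the library
code = survey_code("gg", "FR", "LIB", datetime(2014, 10, 1, 13, 30))
```

Because the code contains no respondent information, surveys remain anonymous while still being traceable to a dissemination site and time.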
Checking with the IRB before housing large data sets is imperative—some colleges have very stringent rules about what types of data storage platforms may be used, while other colleges might not have a stand-alone IRB. Similarly, some institutions cannot afford fee-based programs like Qualtrics or data storage sites like Box. Data management guidelines can vary widely from institution to institution. To avoid analytical bias, surveys with inconsistencies were removed based on four criteria. First, surveys that had more than three blank fields were removed. Next, surveys in which respondents claimed to attend sessions at the WC daily (the WC limited students to two sessions per week) were removed. Surveys that responded “no” to “I know where the BCC WC is located” but responded positively to the attendance question were also removed. The final surveys eliminated were those
that answered “no” to “I have been to the BCC WC” but reported going once a semester or more often. Those that answered “yes” but reported never going to the BCC WC were included in the analysis, because those respondents could either go less than once a semester (over the time they are enrolled at the college) or have attended in the past but never returned to the WC. Approximately 7% of responses each year were deemed incomplete based on these parameters. A total of 810 surveys (462 from 2014 and 348 from 2015) were included in the final analysis. After surveys with multiple inconsistent responses were removed from the data set, a new variable, “WC knowledge,” was calculated from the three true-or-false questions—Questions 2, 3, and 4—and Question 6 (see Appendix), which were added together as a knowledge score. A correct answer was given a score of 1 and an incorrect answer a score of 0. Each student was thus assigned a knowledge score ranging from 0 to 4, where 0 means they answered none of the general WC knowledge questions correctly and 4 means they correctly answered all questions regarding WC general knowledge.1 Both the knowledge score and Question 11, about writer development, were analyzed using an ANOVA in R Studio (v.0.99.491).2
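The exclusion criteria and the knowledge score can be expressed compactly in code. The following Python sketch is illustrative only—the study's analysis was run in R Studio, and the field names and answer key below are hypothetical stand-ins for the actual instrument's labels:

```python
def is_valid(survey):
    """Apply the four exclusion criteria described above.
    `survey` is a dict with hypothetical field names."""
    # 1. More than three blank fields -> removed
    if sum(1 for v in survey["answers"].values() if v is None) > 3:
        return False
    # 2. Claims daily attendance (the WC capped visits at two per week)
    if survey["attendance"] == "daily":
        return False
    # 3. Says they don't know where the WC is, yet reports attending
    if not survey["knows_location"] and survey["has_attended"]:
        return False
    # 4. Says they have never been, yet reports going once a semester or more
    if not survey["has_attended"] and survey["attendance"] in (
        "once_a_semester", "monthly", "weekly"
    ):
        return False
    return True

def knowledge_score(survey, answer_key):
    """Sum of correct answers to Questions 2, 3, 4, and 6 (range 0-4)."""
    return sum(
        1 for q in ("q2", "q3", "q4", "q6")
        if survey["answers"].get(q) == answer_key[q]
    )
```

Running every survey through `is_valid` before scoring reproduces the two-step procedure above: filter inconsistent responses first, then compute the 0-4 knowledge variable on the retained set.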
Results

The effect of sweeping policy changes and the introduction of peer tutors to the staff was studied from the perspective of both general knowledge about the writing center and students’ self-perceptions regarding their development as writers. In other words, we wanted to know whether students’ knowledge scores improved from 2014 to 2015 and whether students’ self-assessment of their development as writers changed from 2014 to 2015. Overall, the message of the WC was better communicated to clients after policy, training, and staffing changes were made post-Fall 2014. Students who attended the writing center in 2015 better understood the term edit and consistently reported that the WC helped them develop as writers.
Overall Knowledge of BCC Writing Center Policies and Locations

However, the overall average knowledge score did not change between the two years, remaining consistent at 2.45, demonstrating students’ moderate knowledge of WC policies and locations (see Figure 1 in Appendix). By contrast, the average student who goes to the WC at least once a semester has a higher knowledge score for the practices and location of the WC than those who never use the BCC WC.
Students who do not report attending the writing center have moderate knowledge (2.22) of the BCC WCs’ policies and practices. In contrast, any student who attends a BCC Writing Center, regardless of frequency, has a cumulative score of approximately 2.90. We can further break down these averages to see where misinformation about the WC is most common. One major finding of this study is that students generally have good semantic knowledge of the BCC Writing Centers; most students know there is a writing center on every campus, and they know where a WC is located. They are confused, however, in their procedural knowledge of the writing center, such as whether they can drop off papers and whether tutors edit their papers (see Figure 1 in Appendix). The highest overall knowledge scores are among those who attend the WC, and the most frequently correct responses concern location and availability on all campuses. Those who attend are more likely to answer these two questions correctly (see Table 1 in Appendix). The only question within the knowledge score that changed between 2014 and 2015 was whether the WC edits students’ papers. In 2014, students’ procedural knowledge was incredibly limited and erroneously focused on the idea that tutors would edit their papers. We believe this is because of the directive tutoring model that professional and faculty tutors utilized in the WC up to that point. Supporting this claim, in 2014 more students who attended the writing center thought that tutors edit their papers than students who had never attended the WC, which points to how tutors and clients engaged with writing in-session. Confusion over the definition of “edit” might also have influenced responses. 
In 2015, however, after the introduction of non-directive tutoring models in training sessions, as well as the reintroduction of peer tutors, the pattern reversed; students who attended the WC were less likely to report that the WC edits their papers (see Figure 1 in Appendix; see Table 1 in Appendix for the attendance-by-year interaction). Thus, the initial directive tutoring model profoundly affected students’ perceptions of the WC as an editorial site, rather than as a site that fosters meta-cognition in reading and writing through student-directed learning practices. These data suggest that the writing center’s existence and locations are well advertised, but there is a disconnect between the writing center’s policies and mission and students’ perceptions of what the writing center does, even among students who regularly utilize its services. While in 2014 the policy response results were in line with the center’s older model of
offering more directive feedback and editorial support via an all-faculty professional tutoring staff, widespread changes to tutor training and staffing models in 2015 (such as the addition of peer tutors) contributed to an increase in the frequency of correct responses to the editing question. The BCC tutors were trained to no longer rely on directive feedback and editor personae in their sessions with students, which is reflected in the change in responses to the editing question between the two populations (as well as in the coded qualitative responses, which are unpublished data).
Following the incorporation of non-directive tutoring methods, students were more likely to report that the WC helped them develop as writers.

While knowledge scores are important when considering messaging, mission, and performance of services, an even more important question is whether the WC is having an impact on students’ writing abilities. The flip side of positive impact is negative impact—whether students become dependent on the WC—which, in the data, would appear as reports of little to no development combined with an increase in writing confidence and/or an increase in attendance. To better understand how students utilize, and are affected by engagement with, the BCC WC, we asked them whether they agreed that the WC helped them develop as writers. There was a stark difference between the years, with 2015 showing a stronger and more consistent relationship between the number of visits to the writing center and positive responses to the writing development question (see Figure 2 in Appendix). Prior to the start of the Fall 2014 semester, and for a number of years, there was no consistent staff training. This is reflected in the inconsistency of students’ experiences with the writing center. Previously, we discussed how the former directive tutoring method shaped students’ perceptions of the WC as an editorial space. Student responses to the question about editing were then re-shaped by non-directive training initiatives and changes to the tutor-staffing model in 2015. This pattern is even clearer when comparing how students reported writing development by session attendance in 2014 and 2015 (see Figure 2 in Appendix). In 2014, there is a very weak relationship (R²=0.09) between how often a student attends the writing center and how much they feel they are developing as a writer; this is due to the large spread of individual responses (not shown). Comparing the means and slope (0.4), which is relatively flat, demonstrates that there is only marginal
improvement in development based upon how often a student attends the writing center. This could be due to inconsistent tutoring, or to confusion about the policies and mission of the service, which might have led students to use the WC for a range of reasons, some passive (required to attend for class; expecting line editing), some active (desiring feedback; expecting to collaborate), thus producing inconsistent student outcomes. Looking across the years, we see that how often a student attends the writing center is a strong indicator of how the student feels about the development of their writing process (see Table 2 in Appendix). There is also a significant interaction between the survey's dissemination year and how often students attend the WC, meaning that the relationship between attendance and perceived improvement differs between the two years. In 2014, students could attend 6 sessions and still not report improvement in writing skill; in 2015, students attending far fewer sessions (3) reported improvement in their writing. 2015, then, was a far more consistent year, possibly owing to the extensive training of the staff in Fall 2014 and Spring 2015 and the introduction of trained peer tutors. Now, there is a very strong relationship (R2=0.75)³ between how often a student attends the WC and how much they feel they have developed. Furthermore, comparing the means and the slope (1.2), which is steeper than in 2014, reveals that it takes students fewer attended sessions to report positive outcomes. Interestingly, students are ambivalent or negative about their development as writers for the first 2 appointments at the WC; from 3 appointments onward, however, students reliably report that the WC is having a positive impact on their development. Perhaps this gap in reports of positive impact at 2 or fewer sessions is due to student frustration with a novel (i.e.
non-directive, student-led, meta-cognitively focused) learning model and a desire for clearer rules and guidelines in line with previous educational experiences. The 2015 results, however, stand in stark contrast to 2014, when students could attend the WC 6 times, on average, and still strongly disagree with the assertion that the BCC WC was helping them develop as writers. It is possible that the change in mission between 2014 and 2015, from remedial support to non-label-focused and process-oriented support, also affected students' assessment of their development and writing confidence (unpublished data) (Mohr 2). Overall, the new training changed how tutors interacted with students: active learning models consistently lead to more successful sessions, in that students feel they are developing as writers.
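The year-over-year contrast described above (a weak, flat 2014 fit of R2=0.09 with slope 0.4 versus a strong, steep 2015 fit of R2=0.75 with slope 1.2) rests on an ordinary least-squares fit of mean development score against sessions attended. A minimal sketch of that calculation in Python follows; the numbers below are illustrative stand-ins, not the study's actual data, and the function name is my own.

```python
import numpy as np

def fit_development(sessions, dev_scores):
    """Least-squares fit of self-reported development vs. sessions attended.

    Returns (slope, r_squared). A steep slope with high R-squared means
    each additional session reliably raises reported development; a flat
    slope with low R-squared means attendance barely predicts it.
    """
    sessions = np.asarray(sessions, dtype=float)
    dev_scores = np.asarray(dev_scores, dtype=float)
    slope, intercept = np.polyfit(sessions, dev_scores, 1)
    predicted = slope * sessions + intercept
    ss_res = np.sum((dev_scores - predicted) ** 2)      # residual variance
    ss_tot = np.sum((dev_scores - dev_scores.mean()) ** 2)  # total variance
    return slope, 1.0 - ss_res / ss_tot

# Hypothetical data: a tight, steep trend (2015-like) vs. a noisy, flat one (2014-like)
tight = fit_development([1, 2, 3, 4, 5, 6], [1.0, 2.1, 3.2, 4.2, 5.1, 6.3])
noisy = fit_development([1, 2, 3, 4, 5, 6], [3.0, 1.5, 4.0, 2.0, 3.5, 2.5])
```

Running this on real survey means would reproduce the kind of slope and R2 comparison reported here: the "tight" series yields a near-1 R2 and a steep slope, while the "noisy" series yields an R2 near zero.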
Praxis: A Writing Center Journal • Vol 15, No 1 (2017) www.praxisuwc.com
Conclusions

There are a number of factors to consider before implementing a longitudinal RAD study. The first concerns planning, the second sample size, and the third the long-term effects of such work. I address each in turn below. Implementing a longitudinal study on the same schedule as the tutoring course helped to frame an approach by establishing a consistent sampling period (fall semesters) and a consistent investigator pool (student researchers from the course). Even if a given CC's WC does not have a tutor-training course, a similar project is still possible. One alternative model is to recruit students from across the disciplines and in general education courses (such as introduction to psychology or statistics) to conduct similar survey work as part of an experiential learning component or final project for a class. Another is to ask peer tutors to conduct the survey work during "no-shows" and other downtime in the writing center. In short, recruiting student researchers is crucial to conducting "on-the-ground" surveying and response coding; this model can also shape a research profile for a CC WC in a relatively short amount of time while building a community of practice in which student researchers and, later, peer tutors feel empowered to conduct their own studies. However, it is imperative to ensure a large enough sample size for similar RAD projects; this is a direction the field ought to move toward, alongside prioritizing replicability in survey work and surveying non-attendees as well as WC attendees.
For community colleges, however, whose varied population demographics often do not align with four-year colleges' WEIRD (Western, Educated, Industrialized, Rich, Democratic) student populations, it is necessary for scholars in the field to account for such variation in their experimental design and adjust sample size upward accordingly (Henrich et al.). The more variation there is in a population, the larger a survey's sample size needs to be in order to detect differences between groups. The threshold to significance might be much smaller at schools that have less variation in student population demographics and a more contained campus location. Still, it is imperative that we capture as wide a cross-section of student populations as possible, as doing so offers a clearer picture of the challenges and needs of individual student groups within the general population. Surveying those who do and do not utilize writing center services is one such approach to achieving significance. The types of
students that attend community colleges are far from uniform, as their life experiences, socioeconomic statuses, ages, and other demographics suggest. Studying their perceptions of academic support for writing will certainly yield different results than studying those of traditional college students. Yet researchers at two-year colleges are not necessarily constrained by a "uniqueness factor" that renders a study's results inapplicable to other institutional contexts (Driscoll and Perdue, "RAD Research" 121). Instead, local replicable research projects are critical to revealing the challenges that two-year college writing centers (or writing centers at any institution) face and to providing "evidenced based practices" that improve center efficacy and demonstrate what does and does not work (Driscoll and Perdue, "RAD Research" 122). Replicability hinges on keeping one's investigator training, respondent recruitment, sampling methods, sample size, and coding methods consistent. Running multiple "trials" of a study also helps to determine change over time, effect size, and which results are externally applicable, all of which systematically advances writing center research and assessment. I hope to instill in community college WC directors (and directors at all levels of higher education) a sense that conducting RAD research is not only tenable but also critical to our work in the profession. Implementation is an important aspect of the assessment process, as Joan Hawthorne suggests when she writes, "Moving from collection of data to actual use requires attention to two aspects of the research assessment process: analysis of findings and implementation or 'closing the loop'" (243).
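The relationship between population variation and required sample size can be made concrete with a standard power calculation: for a fixed raw difference between groups, more spread in responses means a smaller standardized effect size and therefore a larger required sample. The sketch below uses the common normal-approximation formula for comparing two group means; it is a general statistical illustration, not a method drawn from this study, and the function name is my own.

```python
import math

def n_per_group(effect_size, z_alpha=1.96, z_beta=0.84):
    """Approximate respondents needed per group to detect a standardized
    mean difference (Cohen's d) between two groups, at alpha = 0.05
    (two-sided, z = 1.96) with 80% power (z = 0.84), via the normal
    approximation: n = 2 * ((z_alpha + z_beta) / d)^2.

    A more varied population shrinks d for the same raw difference,
    inflating the required n.
    """
    return math.ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A large effect (d = 0.8) needs only a couple dozen respondents per group;
# a small effect (d = 0.2), typical of subtle perception differences in a
# heterogeneous student body, needs hundreds.
large_effect_n = n_per_group(0.8)
small_effect_n = n_per_group(0.2)
```

This is why a survey at a demographically homogeneous campus can reach significance with far fewer respondents than one at a community college with highly varied student groups.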
In addition to justifying our budgets, RAD research can provide granular data that two-year college writing center administrators can use to inform targeted marketing or specialized training and services, all of which is critical given the very diverse and often time-strapped population of students attending two-year colleges. At the macroscopic level, RAD research methods and replicability can provide a map for other institutions to conduct similar research. Perhaps the most attractive feature of this research method, however, is that it can determine whether students who do attend the writing center are learning and developing as writers. Perceptions about writing centers, ultimately, are present wherever writing centers are located, and those perceptions are often hard to change because they can be so ingrained in an institution's culture. Examining how perceptions influence a student's decision to attend or not attend a writing center is complicated by the fact that most students are probably unaware of what motivates them
in one direction or the other, unless, of course, they are required to attend! Our study revealed that students are more likely to attend the writing center if they have the following traits: moderate writing confidence, receptivity to learning new writing strategies, and moderate engagement with non-directive tutoring processes. Once students attend the writing center, they are more likely to continue attending if they report positive self-assessments of their development as writers. Studying the perceptions and attitudes of writers helps us to understand a decision that might seem spontaneous and happenstance: walking through a writing center's doors. But to keep them coming back, it is important for students to have positive experiences with writing and a belief that they are developing as writers; this, of course, takes time and patience on the part of the writer as well as the tutor. Our finding that students' positive self-reports of writing development require a fairly high threshold of engagement (3 or more sessions) with writing center services runs counter to the administration's focus on unique student interactions, rather than continual student attendance, as a marker of success for our center. This finding also challenges another assertion I often heard (and assessed in the survey): that requiring students to attend a session for class credit makes them more likely to continue attending. When the threshold for positive development is 3 or more sessions, requiring students to attend just once potentially warns them away from the writing center. Findings not included in this paper (but forthcoming in future articles) further support the conclusion that required attendance discourages a large percentage of students from using the writing center again. Thus, RAD research can confirm or disconfirm our hunches and hypotheses and provide us with much-needed evidence for or against our most cherished (or hated) practices.
Acknowledgement

Thank you to Dr. Katherine R. O'Brien (OSU) for support with data analysis and for providing figures in RStudio.

Notes

1. All other coding and scoring metrics are not pertinent to this discussion; however, I am willing to share these methods with those interested in conducting a similar project in the future.
2. Data will be uploaded to the Praxis Research Exchange.
3. Generally, R2 values greater than 0.5 are considered significant when studying human behavior.
Works Cited

“Bristol Community College Fact Sheet.” Bristol Community College, 2015, http://www.bristolcc.edu/media/bccwebsite/about/factsheets/FactSheet2016_left_centered.pdf. Accessed 1 May 2017.

Driscoll, Dana Lynn, and Sherry Wynn Perdue. “RAD Research as a Framework for Writing Center Inquiry: Survey and Interview Data on Writing Center Administrators’ Beliefs about Research and Research Practices.” The Writing Center Journal, 2014, pp. 105-133.

---. “Theory, Lore, and More: An Analysis of RAD Research in The Writing Center Journal, 1980–2009.” The Writing Center Journal, vol. 32, no. 2, 2012, pp. 11-39.

Gardner, Clinton, and Tiffany Rousculp. “Open Doors: The Community College Writing Center.” The Writing Center Director’s Resource Book, edited by Christina Murphy and Byron L. Stay, Lawrence Erlbaum Associates, 2006, pp. 135-145.

Haswell, Richard H. “NCTE/CCCC’s Recent War on Scholarship.” Written Communication, vol. 22, no. 2, 2005, pp. 198-223.

Hawthorne, Joan. “Approaching Assessment as if It Matters.” The Writing Center Director’s Resource Book, edited by Christina Murphy and Byron L. Stay, Lawrence Erlbaum Associates, 2006, pp. 237-248.

Henrich, Joseph, et al. “Most People Are Not WEIRD.” Nature, vol. 466, no. 7302, 2010, n.p., doi:10.1038/466029a.

Lerner, Neal. “Of Numbers and Stories: Quantitative and Qualitative Assessment Research in the Writing Center.” Building Writing Center Assessments that Matter, edited by Ellen Schendel and William J. Macauley Jr., UP of Colorado, 2012, pp. 106-114.

Mohr, Ellen. “Marketing the Best Image of the Community College Writing Center.” Writing Lab Newsletter, vol. 31, no. 10, 2007, pp. 1-5.

Schendel, Ellen, and William J. Macauley, Jr. Building Writing Center Assessments that Matter. UP of Colorado, 2012.
Appendix
Figure 1: The four knowledge questions and their response scores, as well as the aggregated knowledge score across all four questions (left). All responses are broken down by survey year and respondent type (WC attendees and non-attendees), illustrating that respondents had generally strong semantic knowledge of the BCC WC’s existence and location; procedural knowledge about policies (editing and paper drop-off), however, was weak.
Table 1: Results of five ANOVAs showing the effects of attendance and survey year on mean knowledge scores. Students who attended the writing center, in either year, had more accurate overall knowledge scores and more accurate individual scores on the questions regarding location and the drop-off policy. However, there was a shift from 2014 to 2015, whereby students who attend the writing center became more likely to demonstrate correct knowledge about the writing center’s editing policy.
Figure 2: 2014 and 2015 student development as writers, compared with number of sessions attended. From 2014 to 2015, the mean number of sessions attended decreased while students’ self-assessment of their development as writers increased.
Table 2: Results of an ANOVA showing the effects of attendance and year on perceived development. The more students attend the writing center, in either year, the more likely they are to feel that the writing center has helped them develop. Furthermore, the significant interaction of year with how often a student attends highlights the different relationships between attendance and perceived development in the two years. In 2014, there is a weaker relationship between how often a student attends and how much they feel they are developing as a writer; by 2015, this relationship is much stronger. Thus, those who attend more are predictably likely to feel that they have developed as writers.
Writing Center Survey

Please take a moment to answer the following questions to help Bristol Community College’s Writing Center (BCC WC) improve its services. Circle the answer that is correct for you.

1. How often do you use a BCC Writing Center?
Daily
Weekly
Monthly
Once a Semester
Never
2. There is a BCC Writing Center on each campus.
True
False
3. Students can drop off a paper to the BCC WC.
True
False
4. The BCC WC edits my paper.
True
False
5. I have been to the BCC WC.
Yes
No
6. I know where a BCC WC is located.
Yes
No
7. A BCC faculty member required me to attend the BCC WC.
Yes
No
8. Being required to attend the BCC WC benefitted me.
Yes
No
Rate your agreement with the following statements using this scale:
1-Strongly Agree  2-Agree  3-Neither Agree nor Disagree  4-Disagree  5-Strongly Disagree

9. I consider myself a strong writer (1-5): _____
10. I would like to learn new strategies for writing (1-5): _____
11. The BCC WC has helped me develop as a writer (1-5): _____
12. If you have used the Writing Center, please write two things that you learned:
13. Please list two or more services that you want the BCC WC to offer:
Thank you for your feedback!

Surveyor’s Initials: _________ Campus: _________________ Date and Time: ____________ Location (e.g. Classroom, Cafeteria, Library, etc.): _____________________________