I need help with this assignment. It is in APA format, and I have attached the documents that need to be used below.
Bianca Davidson

Recognizing Acts of Reading: Creating Reading Outcomes and Assessments for Writing
Holly Middleton
WPA 36.1 (Fall/Winter 2012)

ABSTRACT

While it is a truism in Composition Studies that academic reading and writing are integrated, reading is not consistently theorized with the same rigor as writing, in both the program and scholarly domains. Yet when the role of reading in a writing program is not articulated, students can experience classroom expectations for reading and writing working at cross-purposes. This essay recounts how one WPA responded to an administrative imperative to improve reading scores by designing and implementing an assessment of reading in Basic Writing classes. What resulted was a collaborative process to account for the often overlooked role of reading that further aligned our curriculum. Articulating reading objectives for composition courses has the potential to improve program outcomes and student learning, and more research is necessary to help WPAs in this process.

One day in 2007, during my first year as WPA at New Mexico Highlands University, the Dean came into my office with a sheet of paper in his hand and showed me a graphed distribution of student test scores. He pointed to the five or six marks on the far left, those that represented scores converting to 3rd- and 4th-grade reading levels, and asked, "What are we doing to help these students?" As a new WPA, I felt overwhelmed by the question. As someone trained in composition studies, I knew that test scores did not assess a student's "general reading ability." Even so, the test scores indicated struggles in reading comprehension or testing situations that seemed wholly outside of my area, and I wondered how to address the needs of students who were represented on the sheet in front of me as statistical outliers.
The Dean and I value different kinds of data—he wants to see numbers go up, distributions move to the right—but I am grateful that he asked and kept asking, as the research and collaboration ensuing from that initial conversation has greatly improved our program. What are we doing to help these students, indeed?

This essay serves two purposes. The first is to demonstrate a process undergone to design and implement reading outcomes and assessments in our Basic Writing-equivalent course. We first implemented a reading examination in fall 2009, and our 2009-2010 and 2010-2011 one-group pre/post-test study results show a statistically significant increase of 12 points in mean test scores. What began as an administrative mandate transformed into a productive interrogation of the role of reading in our program, and this account of the process may be helpful to WPAs currently working under similar administrative imperatives. While a curriculum and its assessment are context-dependent, an account of how one program designed and assessed learning outcomes and met administrative mandates may especially serve WPAs at universities whose Basic Writing programs are currently threatened with elimination. Finally, while my focus is on our Basic Writing course (because its "remedial" status prompted the administration's demand for reading improvement), the process of creating outcomes and assessments is of course relevant to any writing program.

My second purpose is to argue that articulating the role of reading in a writing course can help align a program's objectives, assessments, and curriculum, consequently improving student learning. To this end, more scholarly attention to the role of reading in writing programs would strengthen the field.
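The one-group pre/post-test design described above can be sketched in a few lines of code. This is a minimal illustration only, not the program's actual analysis: the scores, the sample size, and the significance cutoff are all invented for the example, and a real study would use the full cohort and report an exact p-value.

```python
# Hypothetical sketch of a one-group pre/post-test analysis: the same students
# take a reading exam at the start and end of the course, and a paired-samples
# t statistic checks whether the mean gain is statistically significant.
# All scores below are invented for illustration.
import math
from statistics import mean, stdev

pre  = [62, 55, 70, 58, 64, 51, 68, 60, 57, 66]
post = [74, 66, 79, 71, 75, 60, 82, 73, 65, 80]

diffs = [b - a for a, b in zip(pre, post)]   # per-student gains
n = len(diffs)
mean_gain = mean(diffs)
se = stdev(diffs) / math.sqrt(n)             # standard error of the mean gain
t_stat = mean_gain / se                      # paired t statistic, df = n - 1

# With df = 9, |t| > 2.262 corresponds to p < .05 (two-tailed).
print(f"mean gain: {mean_gain:.1f} points, t({n - 1}) = {t_stat:.2f}")
```

The design's weakness, which the essay's later discussion of the COMPASS pre/post-test illustrates, is that a one-group comparison cannot rule out causes other than the course itself.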
In an online search of WPA annual meeting abstracts for the last three years, I noted that in 2009 and 2010 only one panel was explicitly devoted to reading, while the 2011 meeting marked an increase to two. I believe this means we are giving insufficient critical attention to what constitutes a great portion of our work. We would do well to remember that learning to write for a new discourse community requires learning to read for it, that the challenges any beginning graduate student experiences in meeting expectations for writing, for instance, are largely shaped by the demands of graduate-level reading. Such is the case for all first-year college students, whether or not they are placed in Basic Writing. For a WPA, elaborating the full implications of reading as a composing process and its constitutive relation to academic writing requires time and buy-in, but it can make a writing program more coherent and its objectives more visible, and achievable, to instructors and students.

THE ROLE OF READING IN A WRITING PROGRAM

The "basic writer" varies by institution and is always a local construct, making the student population and placement procedures crucial to understanding who this learner is at any institution (Gray-Rosendale et al. 42). So first, some context. New Mexico Highlands University is an open-admissions Hispanic-serving institution in Las Vegas, New Mexico, population 14,000. In fall 2008, 59% of our first-year students were first-generation; 51% were low-income, meaning their families reported incomes of less than $45,000; and 17.9% came from families with incomes below the federal poverty line. However, due to the state's scholarship program, any New Mexico high school graduate who immediately enters college receives scholarship funds.
In fall 2008, 92.9% of full-time first-year (and first-time) Highlands students received some kind of tuition scholarship (New Mexico Highlands University Self-Study 14). At Highlands, incoming freshmen are placed through ACT and COMPASS Reading scores: students who score lower than 17 on the ACT or lower than 80 on the COMPASS Reading exam place into English 100, or "Reading and Writing for College." Approximately 40% of incoming freshmen place into English 100; most of these students are Hispanic and male.

Highlands students who place into English 100 frequently test at scores that convert to 5th–7th grade reading levels. Given the low socioeconomic status (SES) of our student population and the correlation between SES and standardized test scores, this is not surprising. For example, using 2001 national data, Rebecca Zwick found that the average ACT Composite score for students with family incomes of over $100,000 was 23.4; for students from families with incomes less than $18,000 the average was 18.1 (205). This correlation between family income and ACT scores consistently appears at Highlands: in 2001, 80% of our students reported ACT Composite scores, the average of which was 17.92, and it has varied only within a one-point range in the years since. While the link between wealth and test scores may be familiar ground, Zwick found that other indicators of academic achievement such as class rank and GPA are also linked to family income, so intertwined is SES with educational opportunity. Teacher quality is also a factor; Zwick found that the most credentialed and experienced teachers are concentrated in schools with the lowest proportion of students eligible to receive free or reduced-price lunches. In 2007, 66.8% of elementary school students and 52.8% of high school students in New Mexico were eligible for free or reduced lunches, a state concentration of poverty second only to Mississippi (Table A-25-3).
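The placement rule stated above is simple enough to express as a predicate. The function below is purely illustrative: the names and the handling of missing scores are my assumptions, not Highlands' actual placement procedure or software.

```python
# Toy encoding of the placement cutoffs named in the essay: scoring below 17
# on the ACT or below 80 on the COMPASS Reading exam places a student into
# English 100. The function name and the treatment of unreported scores are
# illustrative assumptions.
def placed_in_english_100(act=None, compass=None):
    """Return True if either reported score falls below its cutoff."""
    if act is not None and act < 17:
        return True
    if compass is not None and compass < 80:
        return True
    return False

print(placed_in_english_100(act=15))       # True: ACT below 17
print(placed_in_english_100(compass=85))   # False: COMPASS at or above 80
```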
I explain these correlations to show how I began to read our students' placement and diagnostic scores. Highlands students' ACT or COMPASS scores are not static measures but an indication of their experience with schooling and their history of educational opportunity. Few will initially demonstrate the classroom attitudes and behaviors that lead to the social and economic benefits accruing to middle- and upper-middle-class students. In this way, our student test scores provided local evidence of inequities in educational opportunity and motivated me to consult the impressive body of research on reading, most of which is conducted within education, language, and developmental studies.

Until 2009, administrators judged English 100's effectiveness through one measure: a standardized COMPASS pre- and post-test that ostensibly measured improvement in reading skills but was not connected to course content in any way. Every instructor designed his or her own syllabus, requiring different texts and assignments. Students were initially tested in class, but later testing moved to the Student Services building. In both testing situations, about half would consistently score lower on the post-test than on the pre-test, indicating an assessment result that could just as likely have occurred if no writing course were taken at all.

I should say here that unlike many universities, Highlands sustains a commitment to its diverse student population, a commitment also evident in the fact that people of color comprise 38% of Highlands faculty. To my knowledge, proposals to eliminate Basic Writing never get off the ground. The administration and faculty consider English 100 central to the university's mission, but this translates to a tremendous amount of oversight.
For example, the undergraduate catalog description for English 100, "Reading and Writing for College," stipulates that in addition to earning at least a C in the course, all students will "pass a committee-graded exit exam" to enroll in the first-year composition course, English 111 (New Mexico Highlands University Undergraduate Catalog 50). The C requirement, the exit exam, the use of invalid standardized tests to assess learning, and the withholding of graduation credit all mark an iterative process of addressing learning through accountability measures. In the pursuit of objective assessment and oversight, the administration eventually began putting pressure on the English department to take assessment out of the hands of instructors who taught the course and place it under the purview of tenured and tenure-track faculty, many of whom never taught the course at all, a proposal that did not sit well with anyone in English. This institutional situation compelled me to develop alternative teacher-driven methods of assessment and oversight that would yield quantitative data, beginning with reading.

Rather than an elementary activity, reading comprehension is itself a complex set of practices implied, but not usually elaborated, in our writing programs. Tenaha O'Reilly and Kathleen Sheehan propose a framework for assessing reading competency that privileges "model-building" and "applied comprehension," two skills required for developing reading competency. "Model-building" here refers to the reader's activity of constructing a "mental model of a text's meaning." Inferring, generalizing, and summarizing are all acts of model-building. "Applied comprehension" refers to the act of taking it further, of using the constructed model to "achieve a particular goal (e.g., solve a problem, make a decision, create a presentation or Web site)" (5).
As writing instructors, we require students to perform applied reading comprehension when we ask them to evaluate, critique, integrate, synthesize, or explain (4). A standardized test's efficacy is limited here, as it cannot capture how reading functions in our writing courses. Instead, research on assessment in reading, much like research on assessment in writing, tells us that we should assess the processes we teach in the course.