Teaching Critical Reading and Writing in the Era of Fake News
Table Of Contents
- Cover
- Title
- Copyright
- About the Author
- About the Book
- This eBook can be cited
- Contents
- List of Figures and Tables
- Acknowledgments
- 1. Introduction
- Part I. Disciplinary Responses to the Era of Fake News
- 2. The Reading Moves of Writing Teachers Debating Online
- 3. The Fox and the OWL: Pedagogical Lessons from a Real-World Fake News Controversy
- 4. Search(able) Warrants: Fostering Critical Empathy in the Writing (and Reading) Classroom
- 5. What Is ‘Fake News’? Walls, Fences, and Immigration: How Community-Based Learning Can Prompt Students to Employ Critical Reading and Research Practices
- Part II. Composition Classroom Practices in the Era of Fake News
- 6. Factual Dispute: Teaching Rhetoric and Complicating Fact-Checking with The Lifespan of a Fact
- 7. Fighting Fake News with Critical Reading of Digital-Media Texts
- 8. Critical Science Literacy in the Writing Classroom: A Pedagogy for Post-truth Times
- 9. The Resurgence of the Pacific Northwest Tree Octopus: How Instructors Can Use New Media to Increase Students’ Awareness of Fake News
- 10. Teach from Our Feet and Not Our Knees: Ethics and Critical Pedagogy
- 11. News as Text: A Pedagogy for Connecting News Reading and Newswriting
- Part III. Teaching Visual and Digital Media Literacy in the Era of Fake News
- 12. How Information Finds Us: Hyper-Targeting and Digital Advertising in the Writing Classroom
- 13. Preparing Students to Read and Compose Data Stories in the Fake News Era
- 14. Sleuthing for the Truth: A Reading and Writing Pedagogy for the New Age of Lies
- 15. Hacking Fake News: Tools and Technologies for Ethical Praxis
- Notes on Contributors
- Index
- Series Index
Figures and Tables
Figure 7.1: Course Major Assignments. Source: Author
Figure 7.2: Activity Prompt on Bias Detection in News Sources. Source: Author
Figure 13.1: A and B: Two Data Stories Based on the Same Data Set. Source: Author
Figure 13.2: A and B: Comparison of Two Bar Charts. Source: Author
Figure 13.3: Sample Data Story. Source: Author
Figure 14.1: Wolf Pack. Source: Nature Picture Library
Figure 15.1: Evaluation Heuristic. Source: Author
Acknowledgments
We are grateful to the Research Office (Dr. David Stone, Director) and the College of Arts and Sciences (Dr. Kevin Corcoran, Dean) at Oakland University for their support of this project.
1. Introduction
The central goal of this collection is twofold: it will help faculty understand where students are with respect to reading and understanding the information they encounter on a daily basis (whether that information comes in the form of nonfiction prose they are reading in their courses or the news stories that pop up on their social media accounts), and it will help faculty improve students’ critical reading, writing, and thinking abilities. In the current environment where it is difficult to evaluate claims of “fake news,” “alternative facts,” misinformation and disinformation, expert skills are needed more than ever. In this Introduction, we offer an overview of recent research that shows the current situation in college classrooms across the country, followed by an explanation of the basic psycholinguistic features of the reading process. We know from our own research and that of others that many teachers of college-level writing (and other subjects) have little or no background in the psycholinguistics of reading, so our goal here is to provide a solid backdrop for the chapters. Part of the context we provide in this Introduction also involves bringing to light research we have done separately that documents the absence of attention to reading in the field of composition and rhetoric. Finally, we demonstrate that there are sound reading pedagogies emerging from the field that, when taken alongside those discussed throughout this collection, will give faculty the tools they need to support students’ development of the critical reading, writing, and thinking abilities they will need in this era of fake news.
Students’ Reading Habits
If you get average college faculty from a variety of disciplines together in a room and ask them about the single biggest problem they face in the classroom today, within five minutes the discussion will turn to reading. The focus will be on what Alice calls the “don’t, won’t, can’t” problem. That is, students don’t read as part of their regular range of activities, won’t read assigned material unless it is tied to a grade, and really can’t read in the ways most instructors expect or require. Their ability to comprehend, analyze, synthesize, and make ethical use of extended nonfiction prose material is quite limited.
There is much evidence beyond faculty complaints to support the view that current college students have significant trouble with reading. The available evidence comes from an ever-growing stack of studies of student reading conducted from a variety of vantage points: quantitative, qualitative, online and off. Virtually all of these studies suggest that half or more of students lack the critical reading skills to be successful in college. And while correlation is not causation, when the findings of reading studies are juxtaposed with data showing that roughly half of those who start any form of post-secondary education never complete a degree (Musu-Gillette et al., 2017), it seems at least possible that there is a relationship between reading problems and crossing the finish line. Thus, there are many reasons to be concerned about students’ reading situation; readers of this book need to be aware of students’ ability levels as they consider strategies and approaches for the classroom.
The quantitative data come from large-scale tests of various kinds. The ACT organization has been tracking students’ performance in a reading test for a number of years. The ACT test section devoted to reading comprehension consists of a 35-minute timed test. Students read four passages, answering ten multiple choice questions on each passage. Although there are many flaws in such an approach, ACT claims that it is measuring students’ comprehension, vocabulary, ability to draw inferences and other features of critical reading. The most recent results from the ACT Reading test show an ongoing decline in the number of students scoring at or above the cut-off score that ACT has set. The latest results available as this book is being written are from 2019 when 1.78 million students or 52% of the US high school graduating class took the test. Of these students, only 45% met the ACT College Readiness Benchmark for reading, down from 46% in 2018 and 47% in 2017.
Related large-scale results come from the National Assessment of Educational Progress (NAEP), sometimes referred to as the “Nation’s Report Card” because it tests a US Department of Education sample of students meant to reflect the student population as a whole. NAEP is not administered to high school students every year, so the latest results are from 2015, when some 37% of students in Grade 12 tested proficient in reading (NAEP, 2016). Thinking about the classroom implications of these findings, in a typical class of about 25 first-year college students, half or more are likely to lack the critical reading skills needed to do the reading instructors assign. And that’s not even considering the time demands, the distractions, and the other disincentives to the thoughtful, thorough critical reading expected in college. Again, the results from the ACT and NAEP are based on timed, multiple-choice, paper-and-pencil tests. However, other measures, untimed, online and so forth, produce very similar findings.
For example, while a 2016 report on a sampling of college students’ ability to evaluate news sources (Head et al., 2016) shows a weak ability to apply critical reading to news in both traditional and digital forms, more persuasive evidence comes from an earlier study done by this same team of librarians from the University of Washington (Head et al., 2013). The older study looked at more than 1900 first-year college students’ responses to an online survey, along with follow-up interviews with 35 respondents conducted at six colleges and universities around the country. It found that students have difficulty finding, reading, understanding, evaluating and using research materials for their own purposes. Although the study relies on students’ self-reports of their difficulties with college-level reading and research, a majority of those students describe critical reading as difficult. Similar findings come from international direct tests of reading ability with 15-year-old students in about 30 mostly Northern Hemisphere countries, which show again that half or more of students do not read as well as they could and should (Programme, 2012).
For those who have well-justified reservations about large-scale tests and surveys of this kind, there are other carefully done qualitative studies that offer similar findings. In a nutshell, no matter how critical reading skills are measured, half or more of the students currently in college classrooms lack the ability to analyze, synthesize, evaluate and make ethical use of what they read. Two of the many qualitative studies supporting this claim are the ongoing work of The Citation Project and that of the Stanford History Education Group. The Citation Project has taken a national sample of students’ use of sources in actual writing assignments from colleges and universities across the country (Jamieson & Howard, 2016). Researchers examined each source used in the sample of 2000 citations to see exactly what the writer did in each case. Here are the findings: only 6% of citations use real summary; 46% cite from the first page of a source and 70% from the first two pages; and a majority of the sources were cited only once. The implications of these findings will be clear to most readers: students do not read sources critically and in full to present the substance of an argument, relying instead on what these researchers refer to as “quote mining” to patch together sources for their own papers.
While the Citation Project results draw on students’ use of sources they may have found in traditional print or online forms, the Stanford History Education Group study relied entirely on online materials (Stanford, 2016). The researchers examined more than 7000 student responses to a series of age-appropriate tasks for middle school, high school and college students. Here are the tasks college students were asked to do in an untimed exercise: (1) Article Evaluation: In an open web search, students decide if a website can be trusted; (2) Research a Claim: Students search online to verify a claim about a controversial topic; (3) Website Reliability: Students determine whether a partisan site is trustworthy; (4) Social Media Video: Students watch an online video and identify its strengths and weaknesses; (5) Claims on Social Media: Students read a tweet and explain why it might or might not be a useful source of information. The results showed that somewhere between 50% and 80% of the students could not perform these tasks, a finding the researchers described as “appalling.” A second study in 2019 showed that even PhD-holding historians and presumably well-prepared Stanford undergraduates were not as capable as professional fact-checkers at sorting trustworthy online material from fake and otherwise untrustworthy sources (Wineburg & McGrew, 2019).
The most recent study conducted by the Stanford History Education Group (with support from Gibson Consulting) took place between June 2018 and May 2019 and involved 3,446 high school students from across the country. The goal of the study was to “explore whether the intense concern about information literacy since 2016 has had an effect on students’ digital abilities.” With their earlier study in mind, researchers wondered, “Are young people today, over three years after our original study, prepared to make choices based on the digital information they consume?” (5). The short answer given by researchers in “Students’ Civic Online Reasoning: A National Portrait” is no. The report explains that “overall, students struggled on all of the tasks” (14), tasks that “measured their ability to evaluate digital sources” (4). More specifically, “at least two-thirds of student responses were assessed to be at the ‘Beginning’ level [the lowest level] for each of the six tasks.” Moreover, “in four of the six tasks, over 90% of students received no credit at all” and “out of all of the student responses to the six tasks, fewer than 3% [of students] earned full credit” (14).
Given the findings of these studies and many others that will be discussed in the following chapters, it should be clear that secondary and postsecondary students are not the effective, efficient, critical readers instructors expect them to be. And now, more than ever, such skills are essential not only to success in college but also to students’ careers and personal lives beyond college. Moreover, if we want these students to participate fully in our democratic society, they are going to need careful reading skills to evaluate traditional and digital sources of all kinds. The goal of this book is to support faculty as they help students develop these essential abilities.
Psycholinguistic Features of Reading
Reading is a complex activity that involves the interaction of the reader and the writer as they meet in and through the text. College-level academic reading can be defined as a complex, recursive process in which readers actively and critically understand and create meaning through connections to texts, broadly defined (i.e., not just alphabetic texts). The psycholinguistic process of creating meaning via print, sound, or images, on a page or screen, is then often called upon to inform other processes, including analysis, synthesis, evaluation and application; these processes develop through formal schooling and beyond it, at home and at work, in childhood and across the lifespan, and are essential to human functioning in a democratic society (Horning, 2012, p. 41). This process entails the interaction of what readers know (the psycho part) and the language on the page or screen (the linguistic part). But if no meaning is constructed, then the material has not really been read. Students who have run their eyes over lines of print but cannot report on or summarize the content have not read the material. For a different example, those who read mysteries focus on the motive, method and opportunity of the suspects presented in a case, and may need, once the perpetrator is revealed, to return to the beginning of the book to find the clues planted by the author that they missed. Readers can and will go back in this way because of the focus on meaning in reading.
But those clues may have been missed because readers do not look at, see, or need to see every word of a text in order to construct meaning. An exercise Alice does with groups of faculty involves having readers look at a text on the screen that is a simple story containing some misprints (double words, spelling errors, and so forth). Most are focused on reading the story and miss the errors, though some do see them. When she puts the passage on the screen for a second look, many readers are surprised at what they did not see. When she then puts a second, similar passage on the screen, readers are so focused on finding the errors that they have no idea what happened in the story. These insights are captured in the title of a somewhat dated article that makes this point most clearly, “Reading is Only Incidentally Visual” (Kolers, 1968). Kolers, a distinguished psychologist at the University of Toronto, shows that the brain fills in what readers do not see in the course of normal reading. Common experience confirms this: when writers think they have proofread a text carefully and it is returned (by, perhaps, a vigilant English instructor) with assorted typos and other errors marked, the missed errors are a by-product of readers’ normal, speedy focus on meaning and of the brain’s ability to fill in what the reader does not look at or process.
These basic features of reading are part of good readers’ everyday experiences with texts of all kinds. But knowing about how reading works can help faculty demonstrate what happens in good reading so they can improve students’ abilities. In fact, teachers in every discipline can reach their own course objectives more effectively by helping students with reading on every assignment. Focusing students’ attention on meaning, even if that means spending some time on the specific vocabulary of the field, can improve students’ comprehension of assigned texts, which is what most instructors want. Setting up a class in small groups to take apart a writer’s argument to examine the evidence and reasoning used will show students the processes instructors expect them to use on their own reading, regardless of topic. Thus, it should be clear that even these few features of good critical reading can help students do the work and learn the material teachers present. The chapters in this book offer a variety of approaches with specific focus on the problems of the materials in the current information landscape.
Because a fundamental understanding of the psycholinguistics of reading is a first step for faculty who want to help students by supporting the development of critical reading skills, we recommend Wolf’s Reader, Come Home (2018) or Seidenberg’s Language at the Speed of Sight (2017), which offer engaging and insightful explorations of reading. An older book with useful exercises for showing students how reading works is Kenneth Goodman’s On Reading (1996), which can often be found in university libraries or through interlibrary loan.
Reading Compliance
Students’ reading abilities are not the only issue we face as instructors. We know from studies that reading compliance is an issue for students at both community colleges and four-year institutions. If we cannot get students to complete the assigned reading and practice various reading strategies, then we cannot help them become the engaged and informed citizens they must be in a so-called post-truth culture replete with fake news.
Studies show that students’ lack of reading compliance often has much to do with the lack of attention given to reading by instructors. In many cases, instructors assign reading but then don’t follow up on it in any significant or sustained way. Students come to understand that the reading assignments are not particularly important compared to other aspects of the course. In their study of community college students, Annie Del Principe and Rachel Ihara (2017) found that students realized that “often reading isn’t truly ‘required’ in their classes and that it’s possible for a student to get by, even succeed, in coursework without doing any/much assigned reading” (200–201). Taking their cues from instructors who are not consistently connecting the reading to the writing for the course, first-year writing students at the University of Michigan indicated that they “were more or less motivated to read assigned texts depending upon whether they viewed that reading as relevant to the writing assignment” (Bunn, 2013). In other words, if the reading was clearly connected to the writing or other major assignments in the course, then students completed it. If the purpose of the reading was unclear, then students did not feel motivated to read. Finally, in their study at the University of Arkansas, Jolliffe and Harl (2008) found that first-year writing students “were extremely engaged with their reading, but not with the reading for their classes” (600). Based on their findings, these scholars make compelling cases for more clearly connecting reading assignments to the writing and other key aspects of courses. While these recommendations inflect this collection, we are proud that the chapters herein also present new research about students’ reading habits that will expand our understanding of how students read, what motivates them to read, and how we can further support their reading.
Teaching in the Era of Fake News
Teaching in this particular climate, marked by the circulation of fake news and alternative facts, is especially challenging and has raised the stakes even higher for literacy instructors at all levels who are responsible for teaching students how to make meaning—through the practices of reading and writing—from the world that surrounds them. No matter where we position ourselves on the left-right political divide, this challenge is presumably something upon which we can agree.
Still being heavily debated, though, is whether the term “fake news,” which appears in the title of this collection, is even useful. Some have pointed out that the increased circulation of the term “fake news” suggests that it is a new phenomenon. We certainly concede that fake news is not new. Satire, parody, propaganda, and false and misleading “news” stories have been around since the beginning of civilization. What is new, however, is the speed at which these stories spread because of social media and related technologies, as well as the ease with which they can be created because of our advanced digital technology.
Details
- Pages: XII, 258
- Publication Year: 2021
- ISBN (PDF): 9781433175077
- ISBN (ePUB): 9781433175084
- ISBN (MOBI): 9781433175091
- ISBN (Softcover): 9781433188190
- ISBN (Hardcover): 9781433175060
- DOI: 10.3726/b16269
- Language: English
- Publication date: 2020 (December)
- Published: New York, Bern, Berlin, Bruxelles, Oxford, Wien, 2021. XII, 258 pp., 9 b/w ill., 2 tables.