What’s the Real Problem with Fake News? We Are Wired to Believe It
Sep 14, 2017
By Jason Ohler
In 1951, a football game inspired what would become a landmark study in psychology. Dartmouth opposed Princeton in a brutal end-of-season match that yielded a broken nose, a broken leg, and a flurry of penalties. The game's lack of sportsmanship became the topic of much public debate, with each side blaming the other for the lack of civility on the field.
A pair of psychologists, Albert Hastorf of Dartmouth and Hadley Cantril of Princeton, decided to study the polarized reactions to the 1951 matchup as a perceptual problem. They administered a questionnaire to a sample of students from both universities, and they showed a recording of the game to separate samples of students. In both cases the researchers asked participants, “Being as objective as you can possibly be, what did you see?”
The results? Participants overwhelmingly “saw” a version of the game that aligned with their preconceived allegiances. Princeton students blamed the Dartmouth players for the violence; Dartmouth students condemned Princeton’s team for the injuries. Though they had watched the same game, the students believed that whichever team opposed theirs was at fault for the escalating nastiness. The researchers’ conclusion, published in their report “They Saw a Game: A Case Study,” reads as follows: “It seems clear that the ‘game’ actually was many different games and that each version of the events that transpired was just as ‘real’ to a particular person as other versions were to other people.”
Confirmation Bias—Fake News’ Best Friend
The Dartmouth-Princeton case study is one of many over the years that have confirmed an essential truth about human nature: we see what we want to see. More importantly, we look for evidence that supports the belief systems we already hold. The psychological community has given this phenomenon a name: confirmation bias. Confirmation bias means that we interpret new information in a way that confirms what we already believe to be true. For example, many people judge a politician’s actions as “good” or “bad” depending on party affiliation. A person may seek out only defenses of a politician who shares their party, or only criticisms of one who belongs to an opposing party. We fall back on confirmation bias to avoid the discomfort of conflicting perspectives and worldviews on matters that are important to us. Being aware of confirmation bias matters whenever we consume media—especially in the wake of “fake news,” or media composed of fabrications or warped truths.
Test Your Own Bias
To drive home the reality of confirmation bias to my students, I ask them to observe their media input for a few days. They monitor the TV programs they watch, the newspapers and magazines they consult, the social media sites they use, the YouTube videos they view, and anything else they rely on as a source of information. Then I ask them to use the objective inquiry skills they have developed as social scientists to notice how confirmation bias shapes the worldview they build from those media choices. Every one of my intelligent, self-aware students is surprised, and often shocked, at the constraints of the “filter bubbles” they live within—all of which confirm rather than challenge their biases.
We are all in the same boat. We gravitate toward information that supports our worldview; whether the information is real or fabricated often isn't even a concern. We go about our business convinced that we are responsible and balanced in our decisions, when in reality we have only affirmed our own biases.
Circumventing Critical Thought
Our brains, always on the lookout for ways to save energy, prefer habit, mental coasting, and automatic “team think” to critical thought. This means that when we face a new issue, our typical response is not to pause, check our sources, and then consider our options, especially if those options threaten to challenge or broaden our perspectives. That mental process is simply too much work during a taxing day in which we are expected to evaluate prodigious amounts of information. Instead, we latch onto whatever “the team is thinking” as a way of keeping afloat in the oceans of data in which we are continually immersed. The goal is to think as little as possible, and confirmation bias is a quick and easy way to meet that goal in a world overwhelmed with options.
What’s Our Response?
We have to be vigilant and even suspicious of those who deliberately mislead and those who spread fake news simply because they have been misguided by others. The mediascape is like any other community in that it only works as well as its citizens’ commitment to facticity, diversity, and the common good.
When it comes to teaching our students, we should insist that our schools teach media literacy and digital citizenship. Whether our children are consuming or producing media, they should be able to distinguish entertainment from journalism and differentiate opinion from factual presentation. They should learn to critically inquire about a source’s agenda and means of presentation. Critical reading and thinking ought to become a staple of our education curriculum as opposed to an extra-curricular pursuit.
We must teach our students to interact with media sources in a way that is informed, unbiased, and responsible. From the Talmud comes a saying made famous by Anaïs Nin, which seems to explain so much: “We do not see things as they are, we see them as we are.” The message for us is clear: we need to teach our children not only how to think, but also how to be. After all, the quality of our news is determined by the quality of the people who create it.
Dr. Jason Ohler is a professor emeritus, speaker, writer, and lifelong digital humanist who is well known for the passion, insight, and humor he brings to his writings, projects, teaching, and presentations. He has been helping community members, organizations, and students at all levels understand the ethical implications of being digital citizens in a world of roller-coaster technological change. His most recent book, 4Four Big Ideas for the Future, reflects on his 35 years in the world of educational media and innovation in order to chart a course for the future. He is first and foremost a storyteller, telling tales of the future that are grounded in the past. Find Jason on Twitter @jasonohler or visit his website: JasonOhlerIdeas.com.