Wednesday, September 9, 2020

Misperceptions and Misinformation in Election Campaigns

It's election season, so many people are concerned about how all those other voters are going to be misinformed into voting for the wrong candidate. Brendan Nyhan provides an overview of some research in this area in "Facts and Myths about Misperceptions" (Journal of Economic Perspectives, Summer 2020, 34:3, pp. 220-36).

To be clear, Nyhan describes misperceptions as "belief in claims that can be shown to be false (for example, that Osama bin Laden is still alive) or unsupported by convincing and systematic evidence (for example, that vaccines cause autism)." Thus, he isn't talking about issues of shading or emphasis. Nyhan writes: "Misperceptions present a serious problem, but claims that we live in a 'post-truth' society with widespread consumption of 'fake news' are not empirically supported and should not be used to support interventions that threaten democratic values."

So why is the belief that everyone on the other side of the political fence is subject to dramatic misperceptions so widespread? One reason is that both academic research and media coverage of that research tend to focus on examples with partisan distinctions. Nyhan writes:
Public beliefs in such claims are frequently associated with people’s candidate preferences and partisanship. One December 2016 poll found that 62 percent of Trump supporters endorsed the baseless claim that millions of illegal votes were cast in the 2016 election, compared to 25 percent of supporters of Hillary Clinton (Frankovic 2016). Conversely, 50 percent of Clinton voters endorsed the false claim that Russia tampered with vote tallies to help Trump, compared to only 9 percent of Trump voters. But not all political misperceptions have a clear partisan valence: for example, 17 percent of Clinton supporters and 15 percent of Trump supporters in the same poll said the US government helped plan the terrorist attacks of September 11, 2001.

One of my favorite examples is a study that showed respondents pictures of the Inauguration Day crowds for President Obama in 2009 and President Trump in 2017: "When the pictures were unlabeled, there was broad agreement that the Obama crowd was larger, but when the pictures were labelled, many Trump supporters looked at the pictures and indicated that Trump's crowd was larger, an obviously false claim that the authors refer to as 'expressive responding.'" (I love the term "expressive responding.")

Sometimes people are aware that they are slanting their answers in this way. When people give these kinds of answers to poll questions, they often know (and will say when asked) that some of their answers are based on less evidence than others. One study offered small financial incentives (like $1) for accurate answers and found that the partisan divide was reduced by more than 50 percent.

But other times, people make meaningful real-world decisions based on these kinds of partisan feelings. As one example with particular relevance just now, evidence from the George W. Bush and Barack Obama administrations suggests that people whose preferred candidate holds the presidency "express more trust in vaccine safety and greater intention to vaccinate themselves and their children than opposition partisans," a difference that shows up in actual patterns of school vaccinations.

An underlying pattern that comes up in this research is that if people are exposed to a claim many times (say, the false statement "The Atlantic Ocean is the largest ocean on Earth"), they become more likely to rate it as true: when a claim feels familiar because of repeated prior exposure, people become more likely to view it as true. An implication is that while those who marinate themselves in social media discussions of news may be more likely to think of themselves as well-informed, they are also probably more likely to hold severe misperceptions. Indeed, more knowledgeable people are also the ones best able to deploy counterarguments, so they can end up believing their misperceptions even more strongly.

Nyhan's paper mentions many intriguing studies along these lines. But do we need public action to fight misperceptions? It's not clear that we do. A common finding in these studies is that even if someone discovers and admits that they hold a misperception on a certain issue, it doesn't actually change their partisan beliefs. "Fact-checking" websites have some use, but they can also be another way of expressing partisanship--and those who hold misperceptions most strongly are not likely to be reading fact-checking sites anyway. Even general warnings about "fake news" can backfire: some research suggests that when people are warned about fake news, they become skeptical of all news, not just the false kind. One interesting study warned a random selection of candidates running for office in nine states in 2012 that the reputational effects of being called out by fact-checkers could be severe, and found that candidates who received the warnings were less likely to have their accuracy publicly challenged.

Nyhan concludes with this response to suggestions for more severe and perhaps government-based interventions against misperceptions: 

Calls for such draconian interventions are commonly fueled by a moral panic over claims that “fake news” has created a supposedly “post-truth” era. These claims falsely suggest an earlier fictitious golden age in which political debate was based on facts and truth. In reality, false information, misperceptions, and conspiracy theories are general features of human society. For instance, beliefs that John F. Kennedy was killed in a conspiracy were already widespread by the late 1960s and 1970s (Bowman and Rugg 2013). Hofstadter (1964) goes further, showing that a “paranoid style” of conspiratorial thinking recurs in American political culture going back to the country’s founding. Moreover, exposure to the sorts of untrustworthy websites that are often called “fake news” was actually quite limited for most Americans during the 2016 campaign—far less than media accounts suggest (Guess, Nyhan, and Reifler 2020). In general, no systematic evidence exists to demonstrate that the prevalence of misperceptions today (while worrisome) is worse than in the past.

Or as I sometimes say, perhaps the reason for disagreement isn't that the other side has been gulled and deceived, and that they would agree with you if they just learned the real facts. Maybe the most common reason for disagreement is that people actually disagree.