When Pigs Fly...
The old saying that "a lie can be halfway round the world before the truth has got its boots on" may not be nearly gloomy enough. Turns out that there's a ton of research that says even if the truth ever catches up, the lie will stick around.
Just about every religion places a high value on its version of the truth. Truth is even described by some faiths as a sort of eternal disinfectant, able to scrub the effects of misinformation.
"The truth will make you free," say the Christian scriptures.
Secular pursuits also depend on a belief in the cleansing power of truth. I've been a reporter for more than 30 years. My profession takes it on faith that if we can dig out the truth, we can undo the effect of lies. This election season, more than any other, reporters of various kinds have devoted uncounted hours to fact-checking and issuing corrections. Always with the assumption that we can negate the impact of a falsehood if only we make the error clear.
But a paper published in the most recent edition of Psychological Science in the Public Interest may be the most depressing bit of academic work I've ever read. The cheery title is "Misinformation and Its Correction: Continued Influence and Successful Debiasing." It's the work product of five scholars, all but one working in Australia. It's not a report on original research. Rather it's an encyclopedic survey of existing research concerning the persistence of misinformation.
The bottom line claim: once a mind is besmirched by the stain of misinformation, it's seldom truly clean ever again.
"It is extremely difficult to return the beliefs of people who have been exposed to misinformation to a baseline similar to those of people who were never exposed to it."
And that's true even if people know that they're being given an accurate correction of misinformation.
"For example, the U.S. Centers for Disease Control and Prevention offer patient handouts that counter an erroneous health-related belief (e.g., "The side effects of flu vaccination are worse than the flu") with relevant facts (e.g., "Side effects of flu vaccination are rare and mild")...After a mere 30 minutes, readers of the handouts identify more 'myths' as 'facts' than do people who never received a handout to begin with...Moreover, people's behavioral intentions are consistent with this confusion: they report fewer vaccination intentions than people who were not exposed to the handout."
The paper describes an experiment where people are given an account of a fire. Shortly thereafter, key details are corrected.
"Research using this paradigm has consistently found that retractions rarely, if ever, have the intended effect of eliminating reliance on misinformation, even when people believe, understand, and later remember the retraction...In fact, a retraction will at most halve the number of references to misinformation, even when people acknowledge and demonstrably remember the retraction...."
And by the way, education doesn't help.
"...[A]mong Republicans, greater education was associated with a greater increase in the belief that President Obama was a Muslim (he is not) between 2009 and 2010...Among Democrats, few held this mistaken belief, and education did not moderate the effect."
Why? The paper offers a bunch of possible explanations that have to do with the way we form mental maps of our world.
Maybe our memories grab what we hear first and then add the "correction" tag. But the original memory of the misinformation remains clearer and seemingly more credible. Maybe people just don't like being told that what they've already come to believe is true isn't. (See: death panels.) Maybe people prefer a complete but wrong explanation over one where the error is corrected but leaves a gap in understanding.
Or maybe something else entirely. Who knows? None of the explanations offered seemed particularly compelling.
The scholars offer some possible correctives to the problem that range from the equivalent of chicken soup (what could it hurt?) to the laughably unrealistic.
Chicken soup: Encouraging people to have a skeptical worldview, so they'd be less likely to buy into purported information completely before they check it out. Like telling people they need to exercise more and eat less, it's perfectly good advice.
Unicorn talk: "Misinformation effects can be reduced if people are explicitly warned up front that information they are about to be given may be misleading."
Um. The liars are supposed to hold up a sign saying "don't trust this" before they deliver their misinformation? Sounds like a Monty Python sketch.
A couple of the paper's authors produced a "debunking handbook" that is, if possible, even more discouraging than the paper. For instance:
"The evidence indicates that no matter how vigorously and repeatedly we correct the misinformation, for example by repeating the correction over and over again, the influence remains detectable. The old saying got it right - mud sticks."
I sent the lead author an email asking if he could offer a ray of hope from the abyss his paper portrays. Stephan Lewandowsky is a cognitive scientist in the psychology department of the University of Western Australia.
"The fact that much is being written about misinformation at the moment is itself a sign that a growing number of people are recognizing the problem," he replied. "That can only be positive in the long run."
I didn't find that terribly cheering. He continued:
"The amount of misinformation is potentially discouraging, yes, but we also know that political leadership could make a big difference," he wrote.
Oh, right. The efficacy of my profession depends on politicians to create a cultural climate that reduces the acceptance of misinformation? Having watched the presidential campaign debates, I expect that will happen any day now. The same day that pigs fly.
Hm. "Flying pigs" is an example of misinformation. So now, some small part of your mind will always believe in porcine aviators. Even when I tell you it's not true.