[Embedded video: Monty Python's "Argument Clinic" sketch]

In 2010, researchers Brendan Nyhan and Jason Reifler published a study documenting a phenomenon they called the "backfire effect": when shown facts that contradict their ideologically held beliefs, the study found, many people double down and take an even more extreme position than before.

The story the backfire effect tells about the intransigence of our political opponents is a very appealing one for American political commentators. If you've concluded that the opposition is beyond reason, you no longer have to attempt to persuade them; you can simply threaten them.

But like many social science truisms of the last 40 years, the backfire effect has failed the replication test.

Thomas Wood and Ethan Porter assessed the reactions of more than 10,000 subjects to "52 issues of potential backfire" across a series of studies and were unable to replicate the backfire effect. From an interview the two researchers did with Poynter:

I’m hesitant to use the word “unicorn,” but there’s something about backfire… I have seen so many videos of people making a soufflé and I have tried for years but I cannot get a soufflé to rise.

Backfire is not quite like that, but it does feel like something that you should write to your friends and be excited about when you observe it. Backfire is very unusual, and I don’t think it should be something that affects the way fact-checkers work.

But, says Wood, “we didn’t see any differences on policy preferences among corrected and uncorrected groups…[w]e can make the factual intervention and they’ll cohere with the facts but they may still have the preferences they had beforehand.” [Emphasis mine.]

This may sound like more excuse-making: more mental bulwarks that members of the other team have erected to protect themselves against empirical reality. But as the data gathered in the Wood-Porter studies indicate, humans are pretty good at internalizing new information. It's just that, as Wood points out, facts are "only one component of the way that average people come to hold political preferences."

Explaining to an opponent of gun control, for example, that a firearm kept in the home is far more likely to be used against someone who lives there than against a dangerous criminal breaking in will not be persuasive if she believes the Second Amendment is an important check against a tyrannical government. She understands the statistic, but it doesn't override her main concerns.

She is not stupid or evil. She just has a different worldview. And just because she hasn’t changed her position on guns doesn’t mean that presenting her with that statistic has had no effect.1

In an episode of The Skeptic’s Guide to the Universe podcast, neurologist Steven Novella notes that you’re never going to end an argument with your opponent changing sides. What you can do is plant a seed of doubt in your opponent’s mind. Then, weeks or months later, she’ll find herself repeating your points in an argument with someone else.

But even if you never actually persuade other people to change their minds, I don’t think arguing is pointless. You’re forcing them to reflect on their values in a way they probably wouldn’t by themselves. And at the very least, they’ll come away understanding that the situation is more nuanced than they previously thought. That alone seems valuable.

  1. [UPDATE] Another example, this one on two differing views on the #MeToo movement, can be heard here.