Backfire: Watching Madness in Real Time

October 4, 2020

One of the sad realizations we’ve had in the past 20 years, as scholars have taken a close look at the human ability to deny reality, is that, even when conspiracy theorists are confronted with empirical evidence that their ideation is objectively untrue, rather than relenting and rethinking, they double down.

Raw Story:

President Donald Trump is hospitalized after testing positive for coronavirus, but his base still refuses to accept the reality of the situation: coronavirus is real.

Many of Trump’s supporters are already circulating conspiracy theories to downplay the spread of the virus. Others are attempting to justify how the president may have contracted the virus, insisting someone may have “planted it,” according to The Guardian.

Guardian:

Sean Patterson is not worried that Donald Trump has been hospitalized with coronavirus because he believes what the president tells him.

“It’s a hoax. There’s no pandemic. As Trump said, how many millions die of flu?” said the 56-year-old truck driver outside the early voting station in St Joseph, Missouri – a stronghold for the president.

But then Patterson pauses and contemplates the possibility that Trump really does have Covid-19.

“If he’s sick, then they planted it when they tested him. It’s what they did to me when I went to hospital for my heart beating too fast. Two weeks later I got a cold,” he said. “It’s political. I don’t trust the US government at all. Who are they to mandate personal safety? I listen to Trump.”

JSTOR Daily:

Nyhan and Reifler devised four studies with college undergraduates as their subjects to look precisely at the power of factual corrections. Part of their study used actual quotes on the issue of weapons of mass destruction (WMD) in Iraq in the prelude to the 2003 U.S. invasion, and subsequent corrections. (Iraqi WMD were used as the main pretext for war by the administration of George W. Bush. None existed.)

Some of the students were given corrections to statements of American officials—fact checks, as we now know them. Nyhan and Reifler found that corrections “frequently fail to reduce misperceptions among the targeted ideological groups.” They also observed a “backfire effect,” “in which corrections actually increase misperceptions.” Some people double down on their misconceptions after being shown proof that they’re wrong. They become more convinced of their opinion.

Skeptical Science:

What happens when you remove that element of choice and present someone with arguments that run counter to their worldview? In this case, the cognitive process that comes to the fore is Disconfirmation Bias, the flipside of Confirmation Bias. This is where people spend significantly more time and thought actively arguing against opposing arguments.

This was demonstrated when Republicans who believed Saddam Hussein was linked to the 9/11 terrorist attacks were provided with evidence that there was no link between the two, including a direct quote from President George Bush.3 Only 2% of participants changed their mind (although interestingly, 14% denied that they believed the link in the first place). The vast majority clung to the link between Iraq and 9/11, employing a range of arguments to brush aside the evidence. The most common response was attitude bolstering – bringing supporting facts to mind while ignoring any contrary facts. The process of bringing to the fore supporting facts resulted in strengthening people’s erroneous belief.

If facts cannot dissuade a person from their pre-existing beliefs – and can sometimes make things worse – how can we possibly reduce the effect of misinformation? There are two sources of hope. 

First, the Worldview Backfire Effect is strongest among those already fixed in their views. You therefore stand a greater chance of correcting misinformation among those not as firmly decided about hot-button issues. This suggests that outreaches should be directed towards the undecided majority rather than the unswayable minority.

Second, messages can be presented in ways that reduce the usual psychological resistance. For example, when worldview-threatening messages are coupled with so-called self-affirmation, people become more balanced in considering pro and con information.4,5

Self-affirmation can be achieved by asking people to write a few sentences about a time when they felt good about themselves because they acted on a value that was important to them. People then become more receptive to messages that otherwise might threaten their worldviews, compared to people who received no self-affirmation. Interestingly, the “self-affirmation effect” is strongest among those whose ideology was central to their sense of self-worth. 

Another way in which information can be made more acceptable is by “framing” it in a way that is less threatening to a person’s worldview. For example, Republicans are far more likely to accept an otherwise identical charge as a “carbon offset” than as a “tax”, whereas the wording has little effect on Democrats or Independents—because their values are not challenged by the word “tax”.6

Self-affirmation and framing aren’t about manipulating people. They give the facts a fighting chance.

2 Responses to “Backfire: Watching Madness in Real Time”

  1. Keith McClary Says:

    Nowadays the same sorts of people (and often the same people who gave us Iraq WMDs) are spewing evidence-free warmongering accusations against various “adversary” countries.

