Is There a Vaccine for Misinformation?

No cure for stupid, but….

Washington Post:

Former President Donald Trump falsely claimed the 2020 election was stolen, and his Jan. 6 “Stop the Steal” rally led some supporters to attack the U.S. Capitol. Members of Congress, including Rep. Marjorie Taylor Greene (R-Ga.) and Rep. Lauren Boebert (R-Colo.), have repeatedly shared covid-19 misinformation and embraced QAnon conspiracy theories. And more than 100 Republican candidates in the 2022 midterms continue to promote Trump’s election fraud claims.

The media has routinely reported on these falsehoods, making it seem like misinformation is rampant in politics. But are candidates for Congress actually sharing more misinformation in 2022 than 2020?

Yes, according to our analysis of congressional candidates’ Facebook posts. We found that politicians in the 2022 election are sharing more links to unreliable news sources than they did in 2020, and the increase appears to be driven by nonincumbent Republican candidates.

But there’s one outlier who’s throwing off the 2022 data: Sarah Palin, who is running as a Republican for Alaska’s sole House seat. As of July 12, 2022, she has shared 849 links to unreliable sources out of 853 total, more than 99 percent of her shared sources this year. Palin mostly shares blog posts from her own website, which NewsGuard rates as unreliable. The next closest is Rob Cornicelli, a Republican running in New York, who has shared 88 links to unreliable sources, or 65 percent of his total.
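To make the tally concrete, here is a minimal sketch of how such per-candidate shares might be computed. The records, candidate names, and rating labels are hypothetical stand-ins for the researchers' actual NewsGuard-based Facebook data.

```python
from collections import Counter, defaultdict

# Hypothetical records: (candidate, rating) pairs. The rating stands in
# for a NewsGuard-style reliable/unreliable label attached to each
# domain a candidate linked to on Facebook.
shared_links = [
    ("Candidate A", "unreliable"),
    ("Candidate A", "reliable"),
    ("Candidate B", "unreliable"),
    ("Candidate B", "unreliable"),
]

tallies = defaultdict(Counter)
for candidate, rating in shared_links:
    tallies[candidate][rating] += 1

for candidate, counts in tallies.items():
    total = sum(counts.values())
    share = counts["unreliable"] / total
    print(f"{candidate}: {counts['unreliable']}/{total} unreliable ({share:.1%})")

# Palin's reported counts work out to 849 / 853, roughly 99.5 percent.
```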

Nieman Lab:

From the COVID-19 pandemic to the war in Ukraine, misinformation is rife worldwide. Many tools have been designed to help people spot misinformation. The problem with most of them is how hard they are to deliver at scale.

But we may have found a solution. In our new study we designed and tested five short videos that “prebunk” viewers, in order to inoculate them against the deceptive and manipulative techniques often used online to mislead people. Our study is the largest of its kind and the first to test this kind of intervention on YouTube. Five million people were shown the videos, one million of whom watched them.

We found that these videos help people spot misinformation not only in controlled experiments but also in the real world. Watching one of our videos via a YouTube ad boosted YouTube users’ ability to recognize misinformation.

Debunking (or fact-checking) misinformation, in contrast to prebunking it, has several problems. It’s often difficult to establish what the truth is. Fact-checks also frequently fail to reach the people who are most likely to believe the misinformation, and getting people to accept fact-checks can be challenging, especially if people have a strong political identity.

Studies show that publishing fact-checks online does not fully reverse the effects of misinformation, a phenomenon known as the continued influence effect. So far, researchers have struggled to find a solution that can rapidly reach millions of people.

Inoculation theory is the notion that you can forge psychological resistance against attempts to manipulate you, much like a medical vaccine is a weakened version of a pathogen that prompts your immune system to create antibodies. Prebunking interventions are mostly based on this theory.

Most prebunking models have focused on counteracting individual examples of misinformation, such as posts about climate change. However, in recent years researchers, including ourselves, have explored ways to inoculate people against the techniques and tropes that underlie much of the misinformation we see online. Such techniques include the use of emotive language to trigger outrage and fear, or the scapegoating of people and groups for an issue they have little to no control over.

Online games such as Cranky Uncle and Bad News were among the first attempts to try this prebunking method. There are several advantages to this approach. You don’t have to act as the arbiter of truth, since you don’t need to fact-check specific claims you see online. It allows you to side-step emotive discussions about the credibility of news sources. And perhaps most importantly, you don’t need to know what piece of misinformation will go viral next.

But not everyone has the time or motivation to play a game — so we collaborated with Jigsaw (Google’s research unit) on a solution to reach more of these people. Our team developed five prebunking videos, each lasting less than two minutes, which aimed to immunize viewers against a different manipulation technique or logical fallacy. As part of the project, we launched a website where people can watch and download these videos.

We first tested their impact in the lab. We ran six experiments (with about 6,400 participants in total) in which people watched one of our videos or an unrelated control video about freezer burn. Within 24 hours of viewing, they were asked to evaluate a series of (unpublished) social media content examples that either did or did not use misinformation techniques. We found that people who saw our prebunking videos were significantly less susceptible to manipulation than the control participants.
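The underlying design is a straightforward treatment-versus-control comparison. Here is a minimal sketch of that kind of analysis, using simulated scores rather than the study's actual participant data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated manipulation-discernment scores (higher = better at telling
# manipulative from neutral posts). These numbers are invented; the real
# study scored participants' ratings of unpublished social media content.
prebunked = rng.normal(loc=0.70, scale=0.15, size=500)
control = rng.normal(loc=0.55, scale=0.15, size=500)

# Two-sample t-test comparing group means, mirroring a standard
# treatment-vs-control analysis for this kind of design.
t_stat, p_value = stats.ttest_ind(prebunked, control)
print(f"prebunked mean = {prebunked.mean():.3f}, control mean = {control.mean():.3f}")
print(f"t = {t_stat:.2f}, p = {p_value:.2g}")
```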

But findings from lab studies do not necessarily translate to the real world. So we also ran a field study on YouTube, the world’s second-most visited website (owned by Google), to test the effectiveness of the video interventions there.

For this study, we focused on U.S. YouTube users over 18 years old who had previously watched political content on the platform. We ran an ad campaign with two of our videos, showing them to around 1 million YouTube users. Next, we used YouTube’s BrandLift engagement tool to ask people who saw a prebunking video to answer one multiple-choice question. The question assessed their ability to identify a manipulation technique in a news headline. We also had a control group, which answered the same survey question but didn’t see the prebunking video. We found the prebunking group was 5-10% better than the control group at correctly identifying misinformation, showing that this approach improves resilience even in a distracting environment like YouTube.
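Since the BrandLift measure is a single multiple-choice question, the treatment effect reduces to a difference in correct-response proportions between the two groups. Here is a sketch with invented counts; the study reports only the 5-10% gap, not raw numbers:

```python
import math

def two_proportion_ztest(correct_a: int, n_a: int, correct_b: int, n_b: int):
    """Pooled two-proportion z-test for a difference in correct-response rates."""
    p_a, p_b = correct_a / n_a, correct_b / n_b
    p_pool = (correct_a + correct_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return p_a - p_b, (p_a - p_b) / se

# Invented counts: 63% correct among prebunked viewers vs. 56% in the
# control group, in the spirit of the reported 5-10% improvement.
diff, z = two_proportion_ztest(6300, 10000, 5600, 10000)
print(f"difference = {diff:+.1%}, z = {z:.1f}")
```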

Our videos would cost less than 5¢ per video view (this would cover YouTube advertising fees). As a result of this study, Google is going to run an ad campaign using similar videos in September 2022. This campaign will be run in Poland and the Czech Republic to counter disinformation about refugees within the context of the Russia-Ukraine war.
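For scale, a back-of-the-envelope estimate of what the field study's reach would cost at that rate:

```python
# Uses the article's upper bound of 5 cents per view and the field
# study's reach of roughly one million views.
cost_per_view = 0.05
views = 1_000_000
print(f"estimated ad spend: ${cost_per_view * views:,.0f}")  # about $50,000
```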

When you are trying to build resilience, it is useful to avoid being too direct in telling people what to believe, because that might trigger something called psychological reactance. Reactance means that people feel their freedom to make decisions is being threatened, leading them to dig in their heels and reject new information. Inoculation theory, by contrast, is about empowering people to make their own decisions about what to believe.

At times, the spread of conspiracy theories and false information online can be overwhelming. But our study has shown it is possible to turn the tide. The more that social media platforms work together with independent scientists to design, test, and implement scalable, evidence-based solutions, the better our chances of making society immune to the onslaught of misinformation.

4 thoughts on “Is There a Vaccine for Misinformation?”


  1. I used to think that the best vaccine for misinformation was an education, but my position changed when I started hearing about people like Ben Carson (educated but politically different). Way back in 2012 I heard a piece on NPR about a new book titled “The Republican Brain” (Chris Mooney) which was being complimented by talking heads on both the left and right, so I bought a copy (I am a political centrist). That book was based on the results of scientific research showing that “in general” physiological differences in the human brain are responsible for “a lot” of our politics. Apparently conservatives have a larger amygdala while liberals have a larger anterior cingulate cortex. Back to 2022: over this past weekend I spent time with some cousins at a family reunion. Some are evangelical Lutherans (one has a Master’s degree in Philosophy), and yet it appeared to me that they all engaged in magical thinking on every topic from Trump to Ukraine and Russia to China. Not sure why they gave a fig about Trump (we all live in Canada), so could the problem be social media?


    1. I’ve spent a lot of time with Skeptics groups (legit skeptics, not denialists). While there are a lot of people in those groups who promote early training in logical fallacies, I want to go upstream of that and teach children how the human mind works, starting as young as they can possibly understand it.

      I want everyone to learn up front that the emotional part of the brain (limbic system) makes instantaneous decisions and judgments and the rationales for them come from the cerebral cortex after the fact.

      I want everyone to understand that mental and emotional responses to people’s physical smell, sound and appearance (height, race, facial symmetry, gender) are based on core experiences and associations from infancy (e.g., the human brain is innately racist/tribalist).

      I want everyone to understand how heavily filtered and pre-processed human perceptions are.

      I want everyone to understand how malleable human memory is.

      I want everyone to understand this applies to themselves. No one “gets” it until they regularly recognize it in their own responses and behaviors.


        1. What I glom onto are the active-brain studies that show the sites of brain activity, and especially the order of brain activity, occurring within a second or so. That is, they are measured confirmations of generally solid theory.

          An fMRI scanner is the ultimate lie-to-yourself detector.
