In Case you Wondered – Yes, the Internet Selects for Bullshit

April 18, 2018

And it could get worse if we don’t wise up fast.

MIT News:

A new study by three MIT scholars has found that false news spreads more rapidly on the social network Twitter than real news does — and by a substantial margin.

“We found that falsehood diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information, and in many cases by an order of magnitude,” says Sinan Aral, a professor at the MIT Sloan School of Management and co-author of a new paper detailing the findings.

“These findings shed new light on fundamental aspects of our online communication ecosystem,” says Deb Roy, an associate professor of media arts and sciences at the MIT Media Lab and director of the Media Lab’s Laboratory for Social Machines (LSM), who is also a co-author of the study. Roy adds that the researchers were “somewhere between surprised and stunned” at the different trajectories of true and false news on Twitter.

Moreover, the scholars found, the spread of false information is essentially not due to bots that are programmed to disseminate inaccurate stories. Instead, false news speeds faster around Twitter due to people retweeting inaccurate news items.

“When we removed all of the bots in our dataset, [the] differences between the spread of false and true news stood,” says Soroush Vosoughi, a co-author of the new paper and a postdoc at LSM whose PhD research helped give rise to the current study.

The study provides a variety of ways of quantifying this phenomenon: For instance, false news stories are 70 percent more likely to be retweeted than true stories are. It also takes true stories about six times as long to reach 1,500 people as it does for false stories to reach the same number of people. When it comes to Twitter’s “cascades,” or unbroken retweet chains, falsehoods reach a cascade depth of 10 about 20 times faster than facts. And falsehoods are retweeted by unique users more broadly than true statements at every depth of cascade.

The bottom-line findings produce a basic question: Why do falsehoods spread more quickly than the truth, on Twitter? Aral, Roy, and Vosoughi suggest the answer may reside in human psychology: We like new things.

“False news is more novel, and people are more likely to share novel information,” says Aral, who is the David Austin Professor of Management. And on social networks, people can gain attention by being the first to share previously unknown (but possibly false) information. Thus, as Aral puts it, “people who share novel information are seen as being in the know.”

The MIT scholars examined this “novelty hypothesis” in their research by taking a random subsample of Twitter users who propagated false stories, and analyzing the content of the reactions to those stories.

The result? “We saw a different emotional profile for false news and true news,” Vosoughi says. “People respond to false news more with surprise and disgust,” he notes, whereas true stories produced replies more generally characterized by sadness, anticipation, and trust.

So while the researchers “cannot claim that novelty causes retweets” by itself, as they state in the paper, the surprise people register when they see false news fits with the idea that the novelty of falsehoods may be an important part of their propagation.

Directions for further research

While the three researchers all think the magnitude of the effect they found is highly significant, their views on its civic implications vary slightly. Aral says the result is “very scary” in civic terms, while Roy is a bit more sanguine. But the scholars agree it is important to think about ways to limit the spread of misinformation, and they hope their result will encourage more research on the subject.

On the first count, Aral notes, the recognition that humans, not bots, spread false news more quickly suggests a general approach to the problem.

“Now behavioral interventions become even more important in our fight to stop the spread of false news,” Aral says. “Whereas if it were just bots, we would need a technological solution.”

And then there’s YouTube.


There is a vast network of conspiracy videos on YouTube that feeds off tragic events — including the recent shooting in Parkland, Florida — according to a prominent misinformation researcher.

Jonathan Albright, research director for the Tow Center for Digital Journalism at Columbia University, has studied misinformation on YouTube going back to the 2016 presidential election. But even he was surprised by what he found after mapping out the videos that YouTube suggested alongside videos alleging conspiracies connected to the shooting at Marjory Stoneman Douglas High School.

“I didn’t expect to be shocked when I looked at the results,” Albright wrote in a Medium post published Sunday.

What surprised Albright wasn’t the existence of conspiracy videos. The internet has served as a platform for the paranoid to share their thoughts since its earliest days, a role that first caught wider notice after theories about the 9/11 terrorist attacks reached a growing audience.

Instead, Albright focused on the finding that the many thousands of conspiracy videos that he could identify were being pushed by YouTube’s “Up next” recommendations, which persuade users to stay on the site and watch more videos.

He found that the series of videos served up by the Google-owned platform created a self-reinforcing network that grew stronger as more tragic events occurred.

“Every time there’s a mass shooting or terror event, due to the subsequent backlash, this YouTube conspiracy genre grows in size and economic value,” Albright wrote. “The search and recommendation algorithms will naturally ensure these videos are connected and thus have more reach.”




7 Responses to “In Case you Wondered – Yes, the Internet Selects for Bullshit”

  1. Sir Charles Says:

    An old phenomenon super-boosted by the new media. And the media owners don’t care as long as they can catch the largest possible number of hits… Is that the rope the system is hanging itself with?

  2. botterd Says:

    Not surprising: [1] false news will more often appear to be a remarkable revelation than simple facts, and [2] people who retweet stuff are not thinking about their job, their editors, or the damage to the reputation of the organization should it turn out to be (partly) untrue or misleading.

    That said, the MSM decides whether anything is worth pursuing or newsworthy, and this selection bias is a far greater power than the reporting of untrue facts. The established news will counter that they select news on the basis of newsworthiness and audience interest, but of course they would maintain that.

    In addition, conspiracies are as old as the world, whether we are talking about village gossip, office politics, crime, or the million people involved in the “secret” parts of the government. The credibility of officials is too often taken at face value, and in many things it remains to be seen what the real story is — not just the real facts, but also the real story behind them. It is not easy to determine whether news is being reported truthfully, especially in relation to things that matter.

  3. KeenOn350 Says:

    Nothing new here, except the number of people that can be reached –

    “A lie gets halfway around the world before the truth has a chance to get its pants on.” – Winston Churchill

  4. rhymeswithgoalie Says:

    “We found that falsehood diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information, and in many cases by an order of magnitude,” says Sinan Aral, a professor at the MIT Sloan School of Management and co-author of a new paper detailing the findings.
    Published falsehoods are usually *constructed* to be more attention-worthy, whether to reinforce people’s biases or capture the imagination. “Dog bites man” won’t give you a headline.

    I’m reminded of Big Bang Theory’s Sheldon Cooper and Amy Farrah-Fowler constructing a gossip-spreading experiment by having Amy mention that she (a) is thinking of starting an herb garden, and (b) had sex with Sheldon.

  5. redskylite Says:

    A big problem, made bigger by relatively recent domestic computerisation and instant communication: opinion can be influenced by vested groups who study the dynamics. I’m amazed by what gets shared on social media and by whom, and by how quickly humans make judgements, maybe as part of our ancient survival mechanisms.

    An interesting report in Carbon Brief looks at how possible futures could influence our path choices. Does it ring any bells of worry/alarm/concern? It does for me.

    “Explainer: How ‘Shared Socioeconomic Pathways’ explore future climate change”

    Over the past few years, an international team of climate scientists, economists and energy systems modellers have built a range of new “pathways” that examine how global society, demographics and economics might change over the next century. They are collectively known as the “Shared Socioeconomic Pathways” (SSPs).

    They show that it would be much easier to mitigate and adapt to climate change in some versions of the future than in others. They suggest, for example, that a future with “resurgent nationalism” and a fragmentation of the international order could make the “well below 2C” Paris target impossible.

  6. redskylite Says:

    If you are making a science-based video and want it to impress, the sound quality is paramount:

    The quality of audio influences whether you believe what you hear
    The findings are significant amid the recent rise of fake news and public distrust in science, says USC’s Norbert Schwarz
