Why Social Media Became the Perfect Incubator for Hoaxes and Misinformation

Data scientists are studying how information spreading online influences our social dynamics and what, if anything, can be done to reduce polarization

In the summer of 2015 Governor Greg Abbott gave the Texas State Guard an unusual order: keep an eye on the Jade Helm 15 exercise, just in case the online rumors were true. In reality, Jade Helm 15 was a routine eight-week military exercise conducted in Texas and six other states. In the online echo chamber, however, it was something more sinister: the beginning of a coup ordered by President Barack Obama.

One great promise of the creation of the World Wide Web was that users might be exposed to diversified points of view. Human attention, however, remains limited, and news feed algorithms may favor selective exposure. Users tend to select information that adheres to their beliefs and to join polarized groups formed around a shared narrative, called echo chambers. Inside these closed communities, polarization dominates, and unreliable rumors and conspiracy theories spread widely. Conspiracy theories are nothing new, but in an age of rampant populism and digital activism, they have acquired new power to influence real-world events, usually for the worse. In a 2017 report on global risks, the World Economic Forum named polarization and the viral spread of biased information among the most dangerous social trends of the age. With antidemocratic politicians on the rise throughout the West, we are now seeing the danger of viral misinformation become manifest. Our biases make it difficult to discern reliable information when polarization dominates, and social media platforms have made it easy for ideas, even false ones, to spread around the globe almost instantaneously.

Data scientists have recently made significant progress in understanding the spread and consumption of information, its effect on opinion formation, and the ways people influence one another. Advances in technology have made it possible to exploit the deluge of data that people leave behind as they choose, share and comment on social media, and to study social dynamics at a high level of resolution.


By applying the methods of computational social science to the traces that people leave on Facebook, Twitter, YouTube and other such outlets, scientists can study the spread of conspiracy theories in great detail. Thanks to these studies, we know that humans are not, as has long been assumed, rational. Presented with unfiltered information, people will appropriate that which conforms to their own thinking. This effect, known as confirmation bias, fuels polarization and thus decreases the value of information that is true—propping up theories about global mega conspiracies, connections between vaccines and autism, and other nonsense. On top of this, as the COVID-19 pandemic has made clear, the nuances of how science is conducted are not well understood by the public, and concepts such as uncertainty and complexity are difficult to convey. And some scientists are not very good at communicating.

The Echo Chamber

At the University of Venice, my colleagues and I have spent the past several years investigating the spread of information and misinformation on social networks. We are especially interested in learning how information goes viral and how opinions are formed and reinforced in cyberspace.

One of our first studies on the subject was designed to reveal how social media users treat three different types of information: mainstream news, alternative news and online political activism. The first category is self-explanatory: it refers to the media outlets that provide nationwide news coverage in Italy. The second category includes outlets that claim to report information that the mainstream media have “hidden.” The final category refers to content published by activist groups that use the Web as a tool for political mobilization.

Gathering information for our study, especially from alternative sources, was time-consuming and painstaking. We collected and manually verified various indicators from Facebook users and groups active in fact-checking. Starting from the 50 Facebook pages we investigated, we analyzed the online behavior of more than two million Italian users who interacted with those pages between September 2012 and February 2013. We found that posts on qualitatively different topics behaved very similarly online: roughly the same number of users tended to interact with them, to share them on social media and to debate them. In other words, information from major daily newspapers, alternative news sources and political activist sites reverberates in the same way.
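A minimal sketch of this kind of comparison, in Python, assuming a hypothetical table of posts with one row per post. The file name and the column names ("category", "users_engaged", "shares", "comments") are illustrative placeholders, not the study's actual schema.

    import pandas as pd
    from scipy.stats import ks_2samp

    # Hypothetical export: one row per post, with its content category and
    # raw engagement counts.
    posts = pd.read_csv("posts.csv")

    # Compare the distribution of each engagement metric between two content
    # categories. Statistically indistinguishable samples mean the content
    # types "reverberate" in the same way.
    for metric in ["users_engaged", "shares", "comments"]:
        mainstream = posts.loc[posts["category"] == "mainstream", metric]
        alternative = posts.loc[posts["category"] == "alternative", metric]
        res = ks_2samp(mainstream, alternative)
        print(f"{metric}: KS statistic={res.statistic:.3f}, p-value={res.pvalue:.3f}")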

Two different hypotheses could explain this result. The first possibility was that all users treat all information equally, regardless of its veracity. The other was that members of certain interest groups treat all information equally, whether it is true or not, if it reinforces their preexisting beliefs. The second hypothesis was, to us, more interesting. It suggested that confirmation bias plays an important role in the spread of misinformation.

The Manager and the Message

The next step was to test these two hypotheses. We decided to compare the online behavior of people who read science news with that of people who usually follow alternative news and conspiracy theories. We chose these two types of content because of a very specific difference: whether they have a sender, a manager of the message. Science news is about studies published in scientific journals, work by authors and institutions that are known quantities. Conspiracy theories, in contrast, have no sender: they are formulated in a way that promotes the uncertainty they generate. The subject is always a secret plan or truth that someone is deliberately concealing from the public.

The very complexity of our world—issues such as multiculturalism, the growing intricacy of the global financial system and technological progress—can lead people, regardless of educational level, to choose to believe compact explanations that clearly identify an object of blame. Martin Bauer, a social psychologist at the London School of Economics and a scholar of conspiracy dynamics, describes conspiracy thinking as a “quasireligious mentality.” It is a little bit like the dawn of humanity, when people attributed storms to acts of divinity.

For this study, which we called “Science vs Conspiracy: Collective Narratives in the Age of Misinformation” and published in PLOS ONE, we investigated 73 Facebook pages, 39 of which trafficked in conspiracy theories and 34 of which published science news. Altogether, these pages had more than a million Italian users between 2010 and 2014. We found that both sets of pages attracted very attentive audiences: users who rarely leave their echo chambers. People who read science news rarely read conspiracy news, and vice versa. But the conspiracy pages attracted three times more users.
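The separation of the two audiences can be quantified directly. Below is a minimal sketch, assuming the page interactions have already been reduced to user IDs; the input file and its user_id,page_type format are hypothetical.

    # Build the two audiences from a hypothetical interaction log.
    science_users = set()      # users active on the 34 science pages
    conspiracy_users = set()   # users active on the 39 conspiracy pages

    with open("interactions.csv") as f:  # hypothetical: one "user_id,page_type" per line
        for line in f:
            user_id, page_type = line.strip().split(",")
            (science_users if page_type == "science" else conspiracy_users).add(user_id)

    # A Jaccard overlap near zero means the audiences barely mix: each side
    # stays inside its own echo chamber.
    shared = len(science_users & conspiracy_users)
    jaccard = shared / len(science_users | conspiracy_users)
    print(f"shared users: {shared}, Jaccard overlap: {jaccard:.4f}")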

Facebook's news feed algorithm seems to foment the formation of echo chambers, which play an important role in the spread of false rumors. When we investigated 4,709 posts that satirized conspiracy theories (example: “Airplane chemtrails contain Viagra”), we found that consumers of “real” conspiracy news were much more likely to read these satirical pieces than readers of legitimate science news. We also found that users who focus primarily on conspiracy news tend to share content more widely.
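The satire cross-check reduces to a similar set computation. A minimal sketch, reusing the hypothetical audience sets above plus a hypothetical set of users who interacted with the 4,709 satirical posts:

    def satire_uptake(audience, satire_users):
        """Fraction of an audience that also engaged with the satirical posts."""
        return len(audience & satire_users) / len(audience)

    # rate_conspiracy = satire_uptake(conspiracy_users, satire_users)
    # rate_science = satire_uptake(science_users, satire_users)
    # In our data the first rate was far higher: satire of conspiracy theories
    # circulated mainly among conspiracy readers themselves.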

When we reconstructed the social networks of our two groups (science news readers and fans of conspiracy theories), we discovered a surprising statistical regularity: as the number of likes a user gives to a specific type of narrative increases, so does the probability that the user's virtual social network is composed solely of users with the same profile. In other words, the more you are exposed to a certain type of narrative, the greater the probability that all your Facebook friends will have the same news preferences. The division of social networks into homogeneous groups is crucial to understanding the viral nature of the phenomenon. These groups tend to exclude anything that does not fit with their worldview. Such a tendency has been confirmed in a more general setting: how people read newspapers on Facebook. Our study of 376 million users interacting with news outlet pages on social media, published online in February 2017 in the Proceedings of the National Academy of Sciences USA, found that the more active a user is online, the fewer news sources that user engages with.
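A minimal sketch of how that regularity can be computed, assuming three hypothetical mappings: each user's number of likes on one narrative, each user's friend list and each user's dominant news preference.

    from collections import defaultdict

    def homogeneity_by_engagement(likes, friends, preference):
        """likes: user -> like count; friends: user -> friend IDs;
        preference: user -> "science" or "conspiracy"."""
        buckets = defaultdict(lambda: [0, 0])  # like count -> [homogeneous, total]
        for user, n_likes in likes.items():
            same = all(preference.get(f) == preference[user]
                       for f in friends.get(user, []))
            buckets[n_likes][0] += int(same)
            buckets[n_likes][1] += 1
        # Probability of an entirely like-minded friend network, per engagement level.
        return {n: h / t for n, (h, t) in sorted(buckets.items())}

In our data this probability rose with the like count, which is the homogeneity pattern described above.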

A Wicked Problem

In 2014 we decided to start investigating efforts to correct the spread of unsubstantiated claims on social media. Does debunking work? To find out, we measured the “persistence” of conspiracy news readers who had been exposed to debunking campaigns, that is, their tendency to continue engaging with that type of content over time. Readers exposed to debunking campaigns were 30 percent more likely to keep reading conspiracy news than those who were not. In other words, for a certain type of user, debunking actually reinforces belief in the conspiracy.
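A minimal sketch of that comparison, assuming each user record is a hypothetical dictionary carrying an exposure flag and a count of conspiracy interactions after exposure; the field names are illustrative.

    def persistence_rate(users, exposed):
        """Share of a group that kept engaging with conspiracy content."""
        group = [u for u in users if u["saw_debunking"] == exposed]
        persisting = [u for u in group if u["conspiracy_posts_after"] > 0]
        return len(persisting) / len(group)

    # users = [{"saw_debunking": True, "conspiracy_posts_after": 12}, ...]
    # A higher rate for exposed=True than for exposed=False mirrors the backfire
    # effect we measured: debunking went hand in hand with continued consumption.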

We observed the same dynamics in a study of 55 million Facebook users in the U.S. Users avoid cognitive dissonance by consuming information that supports their preexisting beliefs, and they share that information widely. Moreover, we found that over time people who embrace conspiracy theories in one domain (say, the nonexistent connection between vaccines and autism) will seek out such theories in other domains. Once inside the echo chamber, they tend to embrace the entire conspiracy corpus.
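A minimal sketch of how that drift across domains can be traced, assuming a hypothetical time-sorted stream of (user, conspiracy topic) interaction events.

    from collections import defaultdict

    def topics_over_time(events):
        """events: time-sorted (user_id, topic) pairs, e.g. ("u1", "vaccines")."""
        seen = defaultdict(set)
        trajectory = defaultdict(list)  # user -> running count of distinct topics
        for user, topic in events:
            seen[user].add(topic)
            trajectory[user].append(len(seen[user]))
        return trajectory

    # A per-user count that keeps climbing (vaccines, then chemtrails, then ...)
    # is the signature of a reader gradually embracing the whole conspiracy corpus.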

These dynamics suggest that the spread of online misinformation will be very hard to stop. Any attempt at reasoned discussion usually degenerates into a fight between extremists, which ends in polarization. In this context, it is quite difficult to inform people accurately and almost impossible to stop a baseless report from spreading. We performed another experiment to measure the effect of different journalistic techniques on reducing polarization, but those tactics proved to be relatively ineffective.

In all probability, social media will continue to teem with debates on the latest global mega conspiracy. For those inside the echo chambers, the important thing is to share what is supposedly being hidden from us; whether it is true or false hardly matters. Perhaps we should stop calling this the Information Age and start calling it the Age of Segregation.

MORE TO EXPLORE

Anatomy of News Consumption on Facebook. Ana Lucía Schmidt et al. in Proceedings of the National Academy of Sciences USA, Vol. 114, No. 12, pages 3035–3039; March 21, 2017.

Debunking in a World of Tribes. Fabiana Zollo et al. in PLOS ONE, Vol. 12, No. 7, Article No. e0181821; July 24, 2017.

Social and Political Challenges. Part 2: The Global Risks Report 2017. 12th edition. World Economic Forum, 2017. Available at http://reports.weforum.org/global-risks-2017

Echo Chambers on Social Media: A Comparative Analysis. Matteo Cinelli et al. Submitted to arXiv.org April 20, 2020. Preprint available at arxiv.org/abs/2004.09603

Measuring Social Response to Different Journalistic Techniques on Facebook. Ana L. Schmidt et al. in Humanities and Social Sciences Communications, Vol. 7, No. 17, 7 pages; July 1, 2020.

Walter Quattrociocchi coordinates the Laboratory of Data Science and Complexity at the University of Venice. His research focuses on the quantitative characteristics of social dynamics, from how opinions are formed to the way information spreads. Quattrociocchi is particularly interested in the origin, production, and spread of online narratives and social contagion.

This article was originally published with the title “Inside the Echo Chamber” in Scientific American, Vol. 316, No. 4, p. 60 (April 2017).
doi:10.1038/scientificamerican0417-60