
Attempts at Debunking “Fake News” about Epidemics Might Do More Harm Than Good

Batting down conspiracy theories about disease outbreaks such as that of the new coronavirus may prove counterproductive to public health efforts

A conspiracy theory falsely blamed genetically modified mosquitoes—their larvae are shown here in a laboratory in Brazil—for a Zika outbreak in that country.

Was it a bioweapon from a virology institute? Had it been known before and already patented? Could homeopathic remedies help? All of these ideas about the headline-making novel coronavirus disease—now officially called COVID-19—are blatantly false. As with any recent outbreak, from Zika to Ebola, untruths and conspiracy theories spread as quickly as the pathogen itself.

An emerging line of research exploring what might be called misinformation studies is trying to understand how and why false beliefs arise during public health crises. Media coverage of the new coronavirus is still unfolding and has not yet been rigorously analyzed. But a study of two earlier epidemics, published just as new reports about COVID-19 continued to mount, reveals the difficulty of reversing false rumors about a health crisis.

Researchers at Dartmouth College, IE University in Spain and other institutions conducted social science experiments showing that attempts to counter false beliefs about the Zika virus with information from the World Health Organization were often counterproductive: the debunking failed to lower misperceptions and even reduced respondents’ confidence in accurate information about the epidemic. The study appeared in Science Advances on January 29.


The Zika virus, which can cause birth defects, including microcephaly (a condition in which babies are born with an abnormally small head) and other neurological complications, spurred a series of conspiracy theories in Brazil during the 2015–2016 outbreak there. A pesticide that was erroneously believed to cause microcephaly was even banned. “Emerging diseases are a kind of primordial swamp for conspiracy theories that come out of them,” says study co-author Brendan Nyhan, a professor of government at Dartmouth. His paper points out that false beliefs can counteract people’s feelings of having no control over a situation. Such feelings crop up when “there's a novel threat in the environment and a lack of factual information about the sources of the threat and how to best protect yourself,” Nyhan says. “And in that context, people will often grasp for simple explanations of the threat that may be more intuitive or less psychologically discomforting than the messy, chaotic, random reality of emerging diseases that aren't always easy to understand.”

The study began with a face-to-face survey of 1,532 Brazilians to assess the extent of their misperceptions about Zika. Most respondents accurately answered that the virus is transmitted by mosquitoes and is not contracted by casual contact. But a large number of them also asserted false ideas: More than 63 percent incorrectly believed that genetically modified (GM) mosquitoes spread the disease. And more than half mistakenly thought that the increase in microcephaly cases came about because of childhood vaccinations or a chemical used against the larvae of mosquitoes that transmit Zika to humans.

The survey was followed by randomized online social science experiments in 2017 and 2018 that probed how people reacted when they were told that the beliefs they held about GM mosquitoes were incorrect. The “myths correction message,” based on information from the WHO, did not diminish the credibility of the conspiracy theories for these respondents, compared with views held by a control group. What is more, the correction had a “spillover effect” that significantly reduced people’s beliefs in six of nine accurate facts about the epidemic. The researchers suggest that the reason debunking failed to work may relate to what is called the tainted truth effect: the act of warning the public that previously learned information is inaccurate can increase skepticism about other disease-related knowledge—even if it is correct.

The study also included a separate 2018 experiment in response to a yellow fever epidemic in Brazil. That investigation had better results in using corrective information to shift attitudes—perhaps because the disease was more familiar to Brazilians. But the correction still failed to bolster support for mosquito-control policies or to increase people’s willingness to take preventive measures.

The new paper may point public health campaigns toward strategies that rely less on furnishing debunking information and more on simple messages about the best protective measures to adopt—a lesson that Nyhan says he derives from his research on public attitudes toward childhood vaccinations. “The more effective approach is to work in the community through trusted institutions and leaders to build trust and communicate the importance of vaccinations to public health,” he says.

Adam Berinsky, a professor of political science and director of the Massachusetts Institute of Technology’s political experiments research lab, who was not involved in the study, says it “sheds important light on the limitations of corrective strategies in the realm of public health. The authors tested real-world health appeals and, sadly, found them [to be] of limited effectiveness—especially in attempts to encourage preventative behavior.” He adds that “the results of the study might be disheartening, but in the realm of misinformation, it is as important to figure out what existing programs don’t work as it is to figure out what programs might work.”

Emily Vraga, an associate professor of health communication at the University of Minnesota, who was also not involved in the work, has found in her own research with Leticia Bode of Georgetown University that corrective messaging can, in fact, be effective in changing attitudes. But she praises the quality of the Zika study, even while wondering whether more explicit cues about the source of the information provided to respondents might have helped change their views. “I was surprised and disappointed to see that the correction efforts based on the WHO’s efforts to dispel rumors surrounding the Zika virus were not only ineffective but, in fact, may have been counterproductive,” Vraga says. “I think there’s a good chance that the results would apply to other emerging health issues, such as the [new] coronavirus, as many features of the [epidemic] are similar: a relatively unknown disease, incredibly fast spread and quickly changing evidence regarding [its] effects.”

The extent to which false beliefs can be corrected will require further studies of different epidemics. “There are other results showing that people are sensitive to corrections of conspiracy theories,” says Stephan Lewandowsky, a professor of cognitive science at the University of Bristol in England, who was not part of Nyhan’s paper. “But at the moment, we don’t know for sure when this would apply and when it would not.”

Gary Stix, the neuroscience and psychology editor for Scientific American, edits and reports on emerging advances that have propelled brain science to the forefront of the biological sciences. Stix has edited or written cover stories, feature articles and news on diverse topics, ranging from what happens in the brain when a person is immersed in thought to the impact of brain implant technology that alleviates mood disorders like depression. Before taking over the neuroscience beat, Stix, as Scientific American's special projects editor, oversaw the magazine's annual single-topic special issues, conceiving of and producing issues on Einstein, Darwin, climate change and nanotechnology. One special issue he edited on the topic of time in all of its manifestations won a National Magazine Award. Stix is the author, with his wife, Miriam Lacob, of a technology primer called Who Gives a Gigabyte?: A Survival Guide for the Technologically Perplexed.