
When a Journalist Becomes a Disinformation Agent

Simulation games help newsrooms prepare for covering a chaotic election season

Credit: Hanna Barczyk

I am not the editor in chief of a propaganda farm disguised as a far-right breaking news outlet. But one day last February, just before the world shut down, I got to play one.

About 70 journalists, students and digital media types had gathered at the City University of New York to participate in a crisis simulation. The crisis at hand was the 2020 U.S. presidential election. The game was designed to illuminate how we, as reporters and editors, would respond to a cascade of false and misleading information on voting day—and how public discourse might respond to our coverage. The exercise was hosted by First Draft, a research group that trains people to understand and outsmart disinformation.

After a morning workshop on strategies for reporting on conspiracy theories and writing headlines that don’t entrench lies, the organizers split us up into groups of about 10 people, then gave each “newsroom” a mock publication name. Sitting around communal tables, we assigned ourselves the roles of reporters, editors, social media managers and a communications director. From our laptops we logged into a portal to access the game interface. It looked like a typical work desktop: There was an e-mail inbox, an intraoffice messaging system that functioned exactly like Slack, a microblogging platform that worked exactly like Twitter and a social feed that looked exactly like Facebook. The game would send us messages with breaking events, press releases and tips, and the feeds would respond to our coverage. Several First Draft staffers at a table were the “communications desk,” representing any agency, person or company we might need to “call” to answer questions. Other than that, we received no instruction.

My newsroom was mostly made up of students from C.U.N.Y.’s Craig Newmark Graduate School of Journalism and other local universities. The organizers gave us a few minutes to define our newsrooms’ identities and plan our editorial strategies. The room filled with nervous murmurings of journalists who wanted to fight the bad guys, to beat back misinformation and safeguard election day with earnest, clear-eyed coverage. But I had a different agenda, and I was the one in charge.

“Sorry, team,” I said. “We’re going rogue.”

SIMULATIONS should include extreme scenarios if they are to properly scare people into preparing for the unexpected—into updating protocols and rearranging resources or tripping certain automated processes when things go awry. Yet journalists and scientists tend to resist engaging with the outlandish. We dismiss sensational outcomes, aiming to wrangle expectations back into the realm of reason and precedent. In recent years that strategy has often left us reeling. A Nature article published this past August explained why the U.S. was caught flat-footed in its response to COVID-19: despite the fact that government officials, academics and business leaders have participated in dozens of pandemic simulations over the past two decades, none of the exercises “explored the consequences of a White House sidelining its own public health agency,” wrote journalists Amy Maxmen and Jeff Tollefson.

The success of any scenario game, then, depends on the questions it raises. The game doesn’t need to predict the future, but it does need to pry players away from the status quo, to expand their sense of what is possible. And to stress-test the preparedness of a newsroom on November 3, 2020, things needed to get weird.

Disinformation scholars often warn that focusing on the intent of influence operations or the sophistication of their techniques overestimates their impact. It’s true that many disinformation tactics are not robust in isolation. But the targeted victim is fragile; pervasive anxiety and a deep social divide in America make us vulnerable to attacks from afar and within. And because it’s cheap and easy for bad actors to throw proverbial spaghetti at social feeds, occasionally something sticks, leading to massive amplification by major news organizations. This was my goal as an editor in chief of unreality.

The simulation started off slowly. A tip came in through e-mail: Did we see the rumor circulating on social media that people can vote by text message?

As other newsrooms began writing explainers debunking SMS voting, I assigned a reporter to write a “tweet” that would enhance confusion without outright supporting the lie. After a quick edit, we posted: We’re hearing that it’s possible to vote by text message. Have you tried to vote by SMS? Tell us about your experience! It went up faster than any other content, but the social Web reacted tepidly. A couple of people called us out for spreading a false idea. So we dug in with another post: Text message voting is the way of the future—but Democrats shut it down. Why are elites trying to suppress your vote? Story coming soon!

We continued this pattern of baseless suggestions, targeted at whatever people on the feed seemed to already be worried or skeptical about. Eventually some of the other newsrooms caught on that we might not be working in good faith. At first they treated our manipulations as myths to debunk with fact-laden explainers. But our coverage kept getting dirtier. When an editor from a respectable outlet publicly questioned the integrity of my senior reporter, I threatened to take legal action against anyone who maligned her. “We apologize to no one!” I yelled to my team.

My staff was having fun wreaking havoc. The social platforms in the game were controlled by First Draft organizers (who, I later learned, meted out eight “chapters” of preloaded content), as well as by manual input from the simulation participants in real time. We watched the feeds react with more and more outrage to the “news” we published. Our comms director stonewalled our competitors, who kept asking us to take responsibility for our actions, even forming a coalition to call us out.

Then a new tip appeared: someone on social media said there was an active shooter at her polling place. Everyone’s attention shifted. The first newsroom to get a comment from the “local police” posted it immediately: At this time, we are not aware of any active shooting threat or event. We are investigating. While other teams shared the message and went to work reporting, I saw a terrible opening in the statement’s inconclusiveness. “Let’s question the integrity of the cops,” I whispered maniacally to my team.

We sent out a post asking whether the report could be trusted. In a forest of fear, the suggestion that voters were at risk from violence was a lightning bolt. Social media lit up with panic. A celebrity with a huge following asked her fans to stay safe by staying home. My newsroom quietly cheered. We had found an editorial focus, and I instructed everyone to build on it. We “tweeted” a dozen times, occasionally promising an in-depth story that never arrived.

Once we were on a roll, I paused to survey the room. I watched the other teams spending all their energy on facts and framing and to-be-sures, scrambling to publish just one article debunking the misleading ideas we had scattered like dandelion seeds. We didn’t even need to lie outright: maybe there was an active shooter! In the fog of uncertainty, we had exploited a grain of possible truth.

ABRUPTLY, the organizers ended the game. Ninety minutes had somehow passed.

I took stock of myself standing up, leaning forward with my hands pressed to the table, adrenaline rippling through my body. I had spent the previous year researching digital disinformation and producing articles on its history, techniques and impact on society. Intellectually I knew that people and groups wanted to manipulate the information environment for power or money or even just for kicks. But I hadn’t understood how that felt.

I scanned the faces of my “colleagues,” seeing them again as humans rather than foot soldiers, and flinched at the way they looked back at me with concern in their eyes.

Our debrief of the simulation confirmed that my newsroom had sabotaged the media environment on Election Day. “You sent the other newsrooms into a tailspin,” First Draft’s deputy director Aimee Rinehart later told me. She said I was the first person to co-opt the game as a “bad steward of the Internet,” which made me wonder if future simulations should always secretly assign one group the role of wily propagandist.

It took hard alcohol and many hours for my nervous system to settle down. The game had rewarded my gaslighting with amplification, and I had gotten to witness the spread of my power, not just in likes and shares but through immediate “real-world” consequences.

Playing the bad guy showed me how the design of platforms is geared toward controlling minds, not expanding them. I’d known this, but now I felt why journalism couldn’t compete against influence operations on the high-speed battlefield of social media—by taking up the same arms as the outrage machine, we would become them. Instead we could strengthen our own turf by writing “truth sandwich” headlines and service articles that anticipate the public’s need for clarity. Because ultimately the problem wasn’t about truths versus lies or facts versus falsehoods. It was about stability and shared reality versus disorientation and chaos. And in that day’s simulation of the 2020 election, chaos had won by suppressing the vote.

Jen Schwartz has been a senior features editor at Scientific American since 2017. She produces stories and special projects about how society is adapting, or not, to a rapidly changing world, particularly in the contexts of climate change, health, and misinformation. Jen has led several editorial projects at Scientific American, including a special issue, "How Covid Changed The World" (March 2022); the "Confronting Misinformation" special report (November 2020); and "The Future of Money" special report (January 2018), for which she was interviewed in over a dozen media outlets including CNBC, CBS and WNYC. She co-led projects including the "Truth, Lies, and Uncertainty" special issue (2019) and "Inconceivable" (2018), about research gaps in female reproductive health. Jen also writes and edits essays and book reviews for Scientific American. For 15 years, Jen has reported on sea-level rise and the vexing choices of coastal communities. In 2016, she flew with NASA's Operation IceBridge over Antarctica to report on how polar observations of ice melt lead to ever-improving models for sea-level rise; her resulting feature story, about how a community in New Jersey is retreating from worsening floods, won the 2019 Science in Society Award from the National Association of Science Writers. It has been widely cited in policy and academia, and she has discussed her work on radical climate adaptation at places including the World Economic Forum's Sustainable Development Summit, Telluride Mountainfilm festival, PBS's Story in The Public Square, The Denver Museum of Natural History, and Princeton University's Council on Science and Technology. Jen has moderated panel discussions for a range of audiences, from corporate (3M's State of The World's Science), to global development (UN General Assembly), to government (Earth From Space Institute), to the arts (Tribeca Film Festival). Jen previously worked at Popular Science, GQ, New York Magazine, Outside, and The Boston Globe. She has a B.S. in journalism from the College of Communication at Boston University.

This article was originally published with the title “Power Play” in Scientific American Magazine Vol. 323 No. 5, p. 39
doi:10.1038/scientificamerican1120-39