Misinformation Is a Public Health Crisis—So Let’s Treat It That Way

The rapid transmission of bad information is dangerous. A new approach employs “infodemiologists” to fight outbreaks.

On April 23, 2020, during a White House briefing on the novel coronavirus, then-President Donald Trump had a brief sip of inspiration followed by a dangerous gush of misinformation.

“And then I see the disinfectant, where it knocks it out in a minute,” Trump said. “One minute. And is there a way we can do something like that, by injection inside—or almost a cleaning?”

After a major backlash from the media and medical establishments, Trump backpedaled the next day, saying he had simply been sarcastic.

But it was too late. 

By the following day, the New York Times was reporting: “In Maryland, so many callers flooded a health hotline with questions that the state’s emergency management agency had to issue a warning that ‘under no circumstances’ should any disinfectant be taken to treat the coronavirus. In Washington State, officials urged people not to consume laundry detergent capsules. Across the country on Friday, health professionals sounded the alarm. Even the makers of Clorox and Lysol pleaded with Americans not to inject or ingest their products.”

The next month, the Centers for Disease Control and Prevention (CDC) surveyed Americans’ use of disinfectants, prompted by a surge in calls to poison centers since the onset of COVID-19. Thirty-nine percent of respondents had intentionally engaged in at least one related high-risk practice: 10 percent misted their bodies with disinfectant spray, 19 percent bleached food items, 6 percent inhaled vapor from household disinfectants, 4 percent drank or gargled bleach, and another 4 percent ingested or rinsed their mouths with other cleaning and disinfectant solutions.

This is how an infodemic infects a nation.

David Scales, an assistant professor at Weill Cornell Medicine and chief medical officer of Critica, a new nonprofit combating scientific misinformation, is recruiting and deploying an army of “infodemiologists” to detect and fight online outbreaks of misinformation. In his arsenal: science, the very thing he is protecting.

The team at Critica, funded by the Robert Wood Johnson Foundation (the largest philanthropy in the United States dedicated solely to health), has developed a protocol that integrates the science of myth-busting. The protocol includes strategies for engaging with people who share misinformation: where and when to intervene, how to de-escalate arguments, appropriate initial responses, motivational interviewing, how to offer and repeat correct information, and techniques for building confidence in new behaviors and a sense of group membership.

“Infodemiologists will be new public health champions that can be deployed to any infodemic of misinformation,” Scales says; their job will be to figure out ways to slow or halt that transmission. “Thinking about misinformation as an infodemic suggests that [misinformation] is communicable—it transmits from one person to another.”

With the massive amount of misinformation out there, Scales and his team are always thinking about scalability and sustainability. “A lone infodemiologist does nothing. Their interventions are a drop in the ocean,” he says. 

How will it work? 

First, they will focus on scale by training as many infodemiologists as possible. Taking a cue from the CDC’s Epidemic Intelligence Service (EIS), which trains large numbers of epidemiologists at the state level to monitor and report local outbreaks, Critica envisions a cadre of “disease detectives” ready to respond whenever bad information breaks out. This won’t be their full-time job. Instead, Critica plans to train infodemiologists within professional scientific societies.

Once a critical mass of people is trained to intervene positively and professionally, the hope is that they will help elevate society’s discourse on social media. Finally, the plan is to develop trust over time, with infodemiologists maintaining a consistent presence online and coming to be regarded as trusted public figures in their communities.

“It takes time to build a reputation, especially online, so this isn’t—and shouldn’t be seen as—a flash-in-the-pan solution. This is the start of a long, slow, hard slog out of the pit of misinformation that social media platforms have gotten us into,” says Scales. 

With a problem this complex, there is no magic pill to cure misinformation. Instead, with his doctorate in sociology and a medical degree, Scales integrates science with society in a “social ecological model of health” to combat infodemics on three levels. 

First is to target individual behaviors, such as our tendency to share clickbait without even reading the article, questioning its accuracy, or taking responsibility for endorsing its contents.

Second is to reinforce norms at a community level. “In the circles I run in, posting clickbait or even bringing it up in conversation in a non-ironic way is frowned upon,” Scales says. “In other circles, people posting such attention-getting content on Facebook may get positive reinforcement from their community. So they keep doing it.”

Third is to intervene at the structural level, by adopting laws, policies, and regulations for social media platforms that achieve systemic changes. 

Watchdog organizations can play a role—as they already do. Scales’ infodemiologists work in partnership with the Annenberg Public Policy Center, whose surveillance system provides weekly updates on what misinformation is circulating, and where. Not all misinformation goes viral, and if and when it does, it doesn’t always do so immediately, Scales says.

“You don’t want to be debunking misinformation that nobody’s talking about because then all you’re doing is spreading the misinformation,” Scales says. The danger stems from the “illusory truth effect,” in which we come to believe false information simply because it is repeated again and again.

An ounce of prevention

Extending the infectious disease analogy of the infodemic, Scales’ team provides “pre-exposure prophylaxis” to vaccinate people against misinformation. The coveted vaccine for infodemics, he says, is teaching critical-thinking skills and how to discern high-quality information from misinformation, which may be difficult to do on a societal level.

Luckily he is not alone in the fight. 

One ally is Timothy Caulfield, the director of the Health Law Institute at the University of Alberta. A well-known Canadian health myth-buster with a long list of bestsellers, a large Twitter following, and a Netflix show on fighting misinformation, Caulfield uses his platforms to offer tools and strategies readers can use to immunize themselves against misinformation.

In his recent book, Your Day, Your Way: The Fact and Fiction Behind Your Daily Decisions, he teaches readers how to distinguish accurate science from pseudoscience and to weigh how much evidence supports any given claim. For instance, he warns against being persuaded by anecdotes, which elicit emotions that hijack our brains out of high-road thinking, and which carry neither the quality nor the quantity of evidence of real research. He also coaches the public to consider the source of information, including any inherent biases and conflicts of interest those sources may have.

Gordon Pennycook, an assistant professor of behavioral science at the University of Regina in Saskatchewan, is another ally in the fight against misinformation. His studies, and many others, show that simply prompting participants to think about how accurate a news item is—regardless of their underlying beliefs—improves their discernment about what content they later share. Similarly, asking probing questions that make people reflect on a news article, and thus slow down, improves resistance to misinformation.

When Pennycook tested different placements of accuracy warnings—before, during, or after someone reads a news article—those who read a warning at the end remembered the article’s true accuracy better one week later. “If you read a thing and you’re like, ‘Ah, that makes sense,’ and then you see afterwards that it’s false, that’s a more memorable experience,” he says. Major social media platforms have yet to implement this, and Pennycook hasn’t heard of any interest.

“People are actually pretty good at distinguishing between true and false news content when you ask them about it,” Pennycook says. 

The public is also more trusting of crowdsourced fact-checking than of professional fact-checkers, whom it views as more biased. In a large enough crowd, errors such as political biases or random mistakes tend to cancel out. “If you get people to give the average height of a redwood tree, no one really knows what the answer is. But if you’ve got a thousand people to guess the average height, it ends up being pretty close to what’s true,” he says.
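
Pennycook’s redwood example is the classic “wisdom of crowds” effect: individual guesses are noisy, but when the errors are independent they largely cancel in the average. Here is a minimal sketch of the idea in Python; the tree height, the spread of guessing errors, and the crowd sizes below are illustrative assumptions, not figures from his research:

```python
import random

# Illustrative assumptions (not figures from Pennycook's studies):
# a redwood roughly 300 feet tall, and guessers whose individual
# estimates are off by about 80 feet on average in either direction.
TRUE_HEIGHT_FT = 300.0
GUESS_NOISE_FT = 80.0

random.seed(42)  # make the demo reproducible

def crowd_estimate(n_guessers: int) -> float:
    """Average n independent, noisy guesses of the tree's height."""
    guesses = [random.gauss(TRUE_HEIGHT_FT, GUESS_NOISE_FT)
               for _ in range(n_guessers)]
    return sum(guesses) / len(guesses)

for n in (1, 10, 100, 1000):
    print(f"{n:>5} guessers -> estimate ~{crowd_estimate(n):.0f} ft")

# A lone guess can miss by a lot; a thousand guesses average out to
# within a few feet of the truth, because independent errors cancel.
```

The catch, as the statistics behind the sketch implies, is that only independent errors cancel: a bias shared by the whole crowd would survive the averaging, which is why diversity in the crowd matters for fact-checking.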
