Facebook, Google and Twitter brace for coronavirus vaccine misinformation

It’s not yet clear if the companies will change their policies, though.

Facebook, Twitter and Google are gearing up for the next big misinformation fight. The companies are joining forces with fact checkers and government agencies from around the world to combat misinformation about the upcoming coronavirus vaccines. Under a new partnership, the social media platforms will work with researchers and government agencies in the UK and Canada to create a framework for responding to anti-vaccine misinformation during the pandemic.

“With a coronavirus vaccine now potentially just months away, a wave of related bad information could undermine trust in medicine when it matters most,” Full Fact writes in a press release. “This project is an attempt to learn the lessons of previous waves of bad information, whether during elections or pandemics, to make sure we’re all ready to contain the next crisis before it unfolds.”

Facebook, Google and Twitter are all participating in the effort, along with fact-checking organizations in the US, UK, India, Spain, Argentina and Africa. Also involved: the UK’s Department for Digital, Culture, Media and Sport and Canada’s Privy Council Office (notably, the US government is not participating, at least for now). According to Full Fact, the group will create “standards of accountability for tackling misinformation,” and develop a shared framework for countering “bad information.”

What’s not clear is just how much influence groups like this will ultimately have over social media companies’ policies. Though social media platforms have taken some steps to tamp down on anti-vaccine conspiracy theories, they have a messy history with the subject.

For example, Facebook recently announced a ban on ads that discourage vaccines, and has promoted PSAs for flu shots. But the company has allowed vaccine conspiracy theories to spread on Instagram and in groups on Facebook. And, as recently as September, Mark Zuckerberg defended his decision not to crack down on the platform’s anti-vaxxers.

Likewise, YouTube recently banned misinformation about COVID-19 vaccines, but said it would decline to remove videos that merely express “broad concerns.” Twitter has also taken steps to promote credible information about vaccines and the pandemic, but hasn’t cracked down on specific vaccine conspiracy theories.

It’s also worth considering that fact-checking alone may not be enough. Misinformation about vaccines has been rampant on social media for years despite previous efforts to discourage it. And proponents of these conspiracy theories are gaining a foothold in other online communities, like parenting groups or “natural health” enthusiasts. This has led some experts to warn that fact-checking is unlikely to be effective without other interventions. In a recent report on misinformation about a coronavirus vaccine, First Draft, a nonprofit that researches misinformation on social media, noted that fact-checking and moderation can be counterproductive.

“We need to stop relying on fact-checking efforts and platforms’ content moderation policies to address data deficits. Doing so is reactive, insufficient and potentially counterproductive. For example, greater levels of content moderation could fuel anti-vaccination narratives that claim platforms are attempting a cover-up. They also could encourage key vaccine communities to migrate to alternative platforms that are harder to monitor and research. Proactive messaging that is both compelling and tailored to different audiences is needed.”

Instead, the researchers say, companies should take an approach that provides reliable information where it’s needed, closely watches groups likely to spread conspiracy theories and promotes ways for people with concerns to connect with health experts.

Those all sound like measures these companies could adopt under the partnership with Full Fact. But with the first coronavirus vaccine now on the horizon, they might need to act sooner rather than later.