If it seems like there’s a lot of misinformation about the coronavirus pandemic on Facebook, that’s because there is: Between April and June, the social network says it removed 7 million posts for spreading harmful misinformation about COVID-19. It added labels to an additional 98 million posts that fact checkers deemed false but that didn’t rise to the level of outright removal.
The company released the statistics alongside its community standards enforcement report, which details content takedowns on the social network. Facebook doesn’t typically include misinformation statistics in these reports, but the company has imposed stricter rules for claims about the coronavirus that pose “imminent harm.”
The company removes posts that spread false claims about cures or treatments for COVID-19, as well as other misinformation health organizations say is dangerous. The company has also taken steps to push credible health information, including debunking common rumors about the virus and pushing out PSAs about wearing masks.
Despite these efforts, misinformation about the pandemic has been rampant on both Facebook and Instagram, and the company’s stricter policies have been repeatedly tested.
In May, a viral video that falsely claimed masks make people sick and that the coronavirus was created in a lab racked up millions of views before Facebook removed it. The scenario repeated itself last month when another video, which falsely claimed that the anti-malaria drug hydroxychloroquine was a “cure” for COVID-19, got more than 20 million views in a single day before Facebook took action against it. In both cases, copies of the videos continued to spread across Facebook and Instagram well after the removals began.
Last week, Facebook removed a post from Donald Trump in which he claimed that children are “almost immune” to COVID-19. That post, a video, was also up for several hours and was viewed millions of times before it was removed.