Following similar moves by YouTube and TikTok, Facebook has pledged to remove misinformation about COVID-19 vaccines. As health agencies across the world start to approve vaccines, the company said false claims about them that have been debunked by public health experts will be deleted from Facebook and Instagram.
This is a stricter enforcement of an existing policy targeting COVID-19 misinformation that may "lead to imminent physical harm," such as the promotion of false cures. Facebook gave examples of the false claims it will remove, including ones suggesting vaccines contain microchips, or those about the "safety, efficacy, ingredients or side effects of the vaccines."
Facebook said it won't be able to fully enforce these measures right away. "Since it’s early and facts about COVID-19 vaccines will continue to evolve, we will regularly update the claims we remove based on guidance from public health authorities as they learn more," it wrote in a blog post. Facebook noted it's providing details about vaccines from "authoritative sources of information" via the COVID-19 Information Center.
The company previously banned anti-vaccine ads more broadly. In the past, Facebook has removed false claims about the polio vaccine in Pakistan and the measles vaccine in Samoa.
Meanwhile, Facebook is seemingly evolving its moderation practices on another front. Documents obtained by the Washington Post suggest the company is in the early stages of reworking its algorithms that detect hate speech. Facebook is reportedly trying to make its systems better at locating and automatically removing the “worst of the worst” of hateful language. That’s said to include slurs directed at people of color, the LGBTQ community and Jewish people.