Facebook has banned 3,000 accounts for COVID-19 and vaccine misinformation
It has removed more than 20 million individual posts since the start of the pandemic last year.
Since the start of the coronavirus pandemic, Facebook has taken a much tougher stance on health misinformation than it had previously, removing millions of posts that shared false claims. Now, we know just how many accounts, groups and pages have been banned from the platform for repeatedly breaking those rules: 3,000.
Facebook shared the stat as part of its community standards enforcement report, which measures how the company enforces its rules. The number may seem low given the vast amount of misinformation on Facebook about the pandemic and the vaccines. The company also said that more than 20 million pieces of content have been removed and more than 190 million have received warning labels between the start of the pandemic in 2020 and this past June.
But the relatively low number of bans — just 3,000 — tracks with findings by researchers who say that just a few individuals are responsible for the vast majority of vaccine mistruths on social media.
During a call with reporters, Facebook’s VP of Content Policy, Monika Bickert, said the company has had to continually evolve its policies, and that it now removes 65 types of vaccine falsehoods, such as posts saying COVID-19 shots cause magnetism. She also noted that some groups have used “coded language” to try to evade the company’s detection, which can pose a challenge.
Facebook’s handling of vaccine misinformation has been in the spotlight in recent months as government officials, including President Joe Biden, have said Facebook should do more to counter mistruths about the COVID-19 vaccines. For its part, Facebook says that vaccine hesitancy has declined by 50 percent in the US, according to its surveys, and that its COVID-19 Information Center has reached 2 billion people.