Facebook removes hundreds of QAnon groups

The company isn’t banning the conspiracy theory outright, but is trying to limit its spread.

Facebook is finally cracking down on QAnon: after weeks of pressure, the company says it has removed hundreds of groups and pages, and blocked thousands of ads tied to the far-right conspiracy theory. The move is the social network’s biggest effort to take on the movement, which the FBI warned could pose a domestic terror threat.

Under its new policy, Facebook isn’t banning QAnon or its supporters entirely, but is cracking down on groups and accounts that “discuss potential violence,” and taking steps to make other QAnon accounts and content less visible. Facebook will also block QAnon accounts from running ads, selling products or using other monetization features. “We’ve removed over 790 groups, 100 Pages and 1,500 ads tied to QAnon from Facebook, blocked over 300 hashtags across Facebook and Instagram, and additionally imposed restrictions on over 1,950 Groups and 440 Pages on Facebook and over 10,000 accounts on Instagram,” Facebook wrote in a statement.

QAnon’s reach has exploded since the start of the coronavirus pandemic, and Facebook’s own algorithms, which recommended QAnon content on Facebook and Instagram, have been blamed for helping fuel its rise. The company has faced heavy criticism for not doing more to prevent QAnon content from going viral on its platform. Facebook’s actions come weeks after Twitter cracked down on QAnon and nearly two years after Reddit booted the movement off its platform.

Facebook’s approach is slightly different: Instead of trying to root out the conspiracy theory entirely, the company is taking steps to prevent it from continuing to go viral on its services. Facebook says it will prevent QAnon pages, groups and Instagram accounts from appearing in its algorithmic recommendations, reduce their distribution in News Feed and rank those accounts lower in search results. The company notes that on Instagram it has “temporarily removed” the “related hashtags” feature, which has been a major source of QAnon recommendations.

In addition to QAnon, the new rules will also apply to “offline anarchist groups that support violent acts amidst protests” — an apparent reference to “antifa” — and “US-based militia organizations,” the company said. These groups “have demonstrated significant risks to public safety,” Facebook said, even if they “do not meet the rigorous criteria to be designated as a dangerous organization and banned from having any presence on our platform.”

What’s less clear is just how effective these steps will be. NBC News recently reported that the largest QAnon groups have millions of members (it’s so far unclear how many members were in the groups Facebook took down), and QAnon memes have seeped into other conspiracy theory groups on the platform. Facebook also has a mixed track record when it comes to dealing with conspiracy theories. For example, the company said last year it would limit the visibility of anti-vaccine misinformation, but those groups have continued to find new ways around its enforcement efforts.