Facebook and Reddit ban hate groups in wake of Charlottesville

The communities were condoning and encouraging violence.

AOL, Roberto Baldwin

It's not just domain registrars and game chat services that are cracking down on neo-Nazis in the wake of the racism-fueled violence in Charlottesville. Facebook and Reddit have both confirmed that they've shut down numerous hate groups in the wake of the attacks. Reddit tells CNET that it shut down the /r/Physical_Removal subreddit for content that "incites violence" and thus violates its content policy. Users in the group hoped that people in anti-hate subreddits and at CNN would be killed, supported concentration camps and even wrote poems about killing.

Facebook, meanwhile, banned several hate groups (including Physical Removal), pulled the event page for the Unite the Right march where the violence took place and removed most links to the (now inaccessible) Daily Stormer article attempting to justify the murder of protester Heather Heyer. The only exceptions are posts condemning the story.

A spokesperson for the social network says it will keep removing Facebook posts that include hate speech, praise violent acts or support hate groups. The site doesn't ban groups purely for their political views, it says -- it's when they promote violence that they cross the line.

These moves won't stop neo-Nazis and other racist groups from organizing or promoting violence, but they may push them to harder-to-find corners of the internet. Daily Stormer, for example, moved to a dark web site after GoDaddy and Google dropped its domain registration. That's a mixed bag for the anti-hate camp. While it reduces public exposure to these groups' ideologies, it also hides their discussions; they may plan protests and attacks in secret where before they operated relatively in the open. The Facebook and Reddit bans won't necessarily produce a similar effect, but it won't be surprising if history repeats itself.