Over the last year or so, Facebook's public statements have reflected the ongoing evolution of its moderation policies, both when it comes to election fraud and the even pricklier issue of hate speech. Now, beyond its publicly available Community Standards and various apologies, Motherboard has published internal documents showing what the company is actually policing, and how that has changed over time.
One portion points to Matt Furie's Pepe the Frog character, which has been outright banned when used in hateful contexts, even as other cartoon characters have not. In another, Motherboard highlights policies that have changed in detail and scope, attempting to draw lines between white supremacy, white nationalism and white separatism. While praise, support and representation of white supremacy as an ideology is banned, white nationalism and white separatism are not.
Apparently, Facebook has its own definition of what makes a group a hate group, one that differs from lists compiled by the Anti-Defamation League, for example. Facebook confirmed in a statement that it checks "whether an individual or group should be designated as a hate figure or organisation based on a number of different signals, such as whether they carried out or have called for violence against people based on race, religion or other protected categories."
In one of the documents, Facebook specifically cited the 2017 white nationalist rally in Charlottesville, where a man drove his car into a crowd and killed a young woman who was protesting, as a reason for re-educating moderators, and admitted the differences between these groups can be blurred. Why the company waited until after the rally, rather than being spurred on by the long history of overlapping white power groups in the US or the 2015 Charleston church shooting, is left unclear. Facebook has said "We are opposed to hate speech in all its forms, and don't allow it on our platform," but exactly who decides when that definition changes, and what impact it has on the world we live in, is still murky.