As the company has done a handful of times in the past, Facebook has agreed to change one of its policies following a ruling from the Oversight Board. In response to the board's decision on a video involving two individuals wearing blackface, the company says it will tweak the policy rationale section of its Community Standards on hate speech to add more context on why it prohibits harmful stereotypes. "We want our policies to be consistent, and we do not often publish rationales for each specific policy line in our Community Standards," Facebook says, noting that it is now reconsidering that stance.
The company also plans to do a better job when notifying people that they've violated its rules. This is something the Oversight Board has returned to multiple times in its decisions. "We've made some progress on our hate speech notifications using an additional classifier that is able to predict what kind of hate speech is contained in the content," Facebook says.
As things stand, the company has deployed updated notifications for English-speaking users that include messaging specific to their infraction. So if Facebook removes someone's content because it includes blackface, the message points out that the post was dehumanizing. "We'll continue to explore more granularity and expand these notifications for hate speech to other languages in the future," the company says. It also plans to roll out the same feature to Instagram sometime in the next few months and says it will continue to explore how to make its content policy notifications more transparent.