Facebook will remove calls for violence in preparation for Derek Chauvin verdict

It also promises to protect the family and memory of George Floyd.

As cities and communities across the US anxiously await a verdict in the trial of Derek Chauvin, the former police officer accused of killing George Floyd, Facebook says it's "doing what we can" to prepare. The company claims it's "working around the clock" to identify potential threats on and off its platforms. Specifically, it will remove posts and events that call on people to bring arms to Minneapolis, and it says it will designate other places as "high-risk locations" depending on how the situation develops. Public officials in cities like New York and Los Angeles anticipate there will be protests once the jury announces its verdict. Facebook says its goal is to protect peaceful demonstrations while limiting content that could lead to civil unrest.

"We want to strike the right balance between allowing people to speak about the trial and what the verdict means, while still doing our part to protect everyone's safety," the company said. "We will allow people to discuss, critique and criticize the trial and the attorneys involved."

Additionally, the company says it's working to protect the memory of George Floyd and his family from harassment by removing posts that praise, celebrate or mock his death. It says it may also preemptively limit content that it predicts will end up breaking its Community Standards.

Facebook clearly hopes to avoid a situation like the one that happened last August. The company failed to take down an event page that had called on members of the Kenosha Guard Facebook group to "take up arms" in response to the protests that broke out following the shooting of Jacob Blake. Despite hundreds of people flagging the event, Facebook never actually took the page down. After the company said it had removed both the Kenosha Guard's page and its event, it emerged that the latter had actually been deleted by its organizers. Mark Zuckerberg attributed the moderation failure to an "operational mistake," saying the people who initially reviewed the reports hadn't properly escalated them to the right team.