Meta's Oversight Board escalates Holocaust denial report

The case involves an image of Squidward edited to say that the Holocaust didn't happen.

Meta's Oversight Board is putting a new case under the spotlight, one it believes is relevant to its strategic priorities. In a post, the board announced that over the next few weeks it will review, and accept public comments on, a case appealing Meta's decision not to remove Holocaust-denying content from its platforms. Specifically, the case concerns a post circulating on Instagram that adds a speech bubble to an image of Squidward, a character from SpongeBob SquarePants, claiming that the Holocaust didn't happen. Its caption and hashtags also targeted "specific geographical audiences."

The post was originally published in September 2020 by an account with 9,000 followers, and it was viewed around 1,000 times. A few weeks later, Meta revised its content policies to prohibit Holocaust denial. Despite the new rules and reports from multiple users, the post wasn't promptly removed. Some of the reports were auto-closed under the company's "COVID-19-related automation policies," which were put in place so that Meta's limited pool of human reviewers could prioritize reports considered "high-risk." Other users who reported the post were automatically told that the content didn't violate Meta's policies.

One of the users who reported the post appealed the case to the Board, which determined that it aligns with its effort to prioritize "hate speech against marginalized groups." The Board is now seeking comments on several related issues, such as whether automation can accurately enforce rules against hate speech and how useful Meta's transparency reporting is.

In a post on its transparency page, Meta admitted that it left the content up after an initial review, but it eventually determined that doing so was a mistake and that the post did violate its hate speech policy. The company has since removed the content from its platforms and promised to implement the Board's decision. The Oversight Board can issue policy recommendations based on its investigation, though those recommendations aren't binding and Meta isn't compelled to follow them. Based on the questions the Board wants the public to answer, it could craft recommendations that change the way Meta uses automation to police Instagram and Facebook.