Meta's Oversight Board calls for more inclusive rules on adult nudity

The board said Instagram incorrectly removed images of a trans and non-binary couple with bare chests and covered nipples.

Meta's Oversight Board has overruled the company's takedowns of two Instagram posts showing a transgender and non-binary couple with bare chests and covered nipples. One of the images was posted in 2021 and the other in 2022. In the captions, the couple discussed trans healthcare. The posts noted that one of them planned to undergo gender-affirming surgery to create a flatter chest and that the duo was fundraising to pay for the procedure.

However, Meta took down the posts for violating its rules on sexual solicitation. The Oversight Board says that moderators reviewed the images multiple times following user reports and alerts from automated systems. The couple appealed Meta's decisions to the company and the Oversight Board. Meta determined that removing the posts was the wrong call and restored them, but the board examined the two cases all the same.

The Oversight Board overruled Meta's original takedown decisions. It determined that the removal of the images was not in line with the company's "community standards, values or human rights responsibilities" and that the cases underline core issues with Meta's policies.

The board wrote that Meta's internal guidance for moderators on when to remove posts under the sexual solicitation policy is "far broader than the stated rationale for the policy or the publicly available guidance." It said the discrepancy causes confusion for moderators and users alike. Meta itself has acknowledged that this approach has led to content being incorrectly removed.

In addition, the board called out the inherently restrictive binary perspective of the adult nudity and sexual activity community standard. It noted that the rules, as they stand, generally don't allow Meta's users to post images of female nipples, though there are exceptions for things like breastfeeding and gender confirmation surgery.

"Such an approach makes it unclear how the rules apply to intersex, non-binary and transgender people, and requires reviewers to make rapid and subjective assessments of sex and gender, which is not practical when moderating content at scale," the board wrote. It called the current rules "confusing" and noted that the extensive exceptions (which also allow for images related to protests and breast cancer awareness) "often convoluted and poorly defined." As such, the board claimed, the policy is not workable in practice.

"The board finds that Meta’s policies on adult nudity result in greater barriers to expression for women, trans and gender non-binary people on its platforms," an Oversight Board blog post reads. "For example, they have a severe impact in contexts where women may traditionally go bare-chested, and people who identify as LGBTQI+ can be disproportionately affected, as these cases show. Meta’s automated systems identified the content multiple times, despite it not violating Meta’s policies. Meta should seek to develop and implement policies that address all these concerns."

The board recommended that the company modify its rules on adult nudity and sexual activity to include "clear, objective, rights-respecting criteria" so that everyone is "treated in a manner consistent with international human rights standards, without discrimination on the basis of sex or gender." It urged Meta to review the policy to determine if it protects users against the non-consensual sharing of images and whether other rules need to be tightened on that front. Moreover, it called on Meta to align its guidance to moderators with the public rules on sexual solicitation to minimize errors in enforcing the policy.

“We welcome the board’s decision in this case. We had reinstated this content prior to the decision, recognizing that it should not have been taken down," a Meta spokesperson told Engadget. "We are constantly evaluating our policies to help make our platforms safer for everyone. We know more can be done to support the LGBTQ+ community, and that means working with experts and LGBTQ+ advocacy organizations on a range of issues and product improvements.”

In public comments on the case, several people criticized Meta for the original decisions, arguing that there was nothing sexually explicit about the images. One user called on Meta to bring in LGBTQIA+ human rights specialists and establish policies to protect trans, non-binary and other LGBTQIA+ people from harassment and unfair censorship. Another called out Instagram for a double standard, accusing the platform of permitting images in which nipples are covered only by body tape while removing others in which they're covered by pasties (patches that cover the nipples and areolae).

One person noted that the couple "have helped me accept myself and help me understand things about myself," describing content shared on the account as "very educational and useful." The commenter added that "there is nothing sexual about their nudity and them sharing this type of picture is not about being nude and being provocative."