Facebook accused of shielding far-right activists who broke its rules

Facebook denies that it shielded Britain First for profit.

An upcoming documentary reportedly reveals that Facebook has been protecting far-right activists who would otherwise have been banned over rule violations. The UK's Channel 4 sent a reporter undercover for its documentary series Dispatches and found toxic content, including graphic violence, child abuse and hate speech, that moderators from Facebook contractor CPL refused to remove. Facebook admitted that it made mistakes with regard to content moderation, but denied that it sought to profit from the extreme content.

Facebook pages are normally deleted if they contain more than five posts that violate the site's rules. But Channel 4's undercover report discovered that popular pages are protected from those rules by a second tier of moderation called a "shielded review." In such cases, Facebook staff, rather than external contractors like CPL, decide whether to take action.

Normally, shielded reviews are reserved for government and news organizations, but the status was also granted to the extremist UK far-right group Britain First and its leader Stephen Yaxley-Lennon, aka Tommy Robinson. As a result, content that violated Facebook's rules, including hate speech, was left up. A moderator told the reporter that "they have a lot of followers so they're generating a lot of revenue for Facebook."

The moderator also said that Facebook trains contractors to ignore visual evidence that a user is under 13 unless the person admits to being underage. Consequently, the company failed to take down posts showing abuse and violence against children. That includes one post from 2012 showing a grown man beating a small child, which had been shared 44,000 times and remained up during the making of the documentary. Another meme showed a little girl being held underwater, with the caption "when your daughter's first crush is a little negro boy."

In response to the documentary, which airs tonight at 9 PM in the UK, Facebook admitted in a post from Global Policy Management VP Monika Bickert that it made "mistakes," and that "we have been investigating exactly what happened so we can prevent these issues from happening again."

Facebook also provided a transcript of Channel 4's interview with VP of public policy Richard Allan. He said that moderation mistakes happened because training staff were using "old training decks using wrong examples, wrong material," and that the company has taken steps to rectify that. He also claimed that Facebook had taken down the video of the young child being beaten, but Channel 4's reporter noted that it "is still on Facebook right now."

The report also found that instances of hate speech, including a comment telling Muslim immigrants to "fuck off back to your own countries," were allowed to stay on the site. Allan called that content "right on the line," because it was addressed to Muslim immigrants and not Muslims in general. "People are debating very sensitive issues on Facebook, including issues like immigration. And that debate can be entirely legitimate," he said.

Facebook also denied that it kept hate speech content online for profit. "It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true," wrote Bickert. "Creating a safe environment where people from all over the world can share and connect is core to Facebook's long-term success."

However, the social network has been in the news before for giving Holocaust deniers and others a forum. Channel 4 also interviewed Roger McNamee, an early Facebook investor, who said that extremist content can be very profitable. "Facebook understood that it was desirable to have people spend more time on site if you're going to have an advertising-based business," he said. "It's the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform."

Update: In response to Engadget's request for comment, Facebook said that it has taken the following actions since being contacted by Dispatches:

  • A review of training practices across our contractor teams, including CPL

  • Refresher training by Facebook employees for all trainers working at CPL

  • A review of staffing at CPL to ensure that anyone who behaves in ways that are inconsistent with Facebook's values no longer works to review content on our platform, and

  • Updated training materials for all reviewers, clarifying our policies in all the areas raised by Dispatches.

It's clear that some of what is shown in the program does not reflect Facebook's policies or values, and falls short of the high standards we expect. We take these mistakes in some of our training processes and enforcement incredibly seriously and are grateful to the journalists who brought them to our attention. Where we know we have made mistakes, we have taken action immediately. We are providing additional training and are working to understand exactly what happened so we can rectify it.