Facebook admits its image screening fell short

The company says it has improved its content moderation process.

To say that Facebook has some egg on its face right now would be an understatement. The social network not only failed to take down some sexualized images of children, but reported the BBC to the authorities when the broadcaster drew those images to its attention. However, the company now says it has turned a corner. Facebook's Simon Milner tells the UK's Home Affairs Committee that the incident showed the company's moderation system "was not working." The offending photos have since been taken down, he says, adding that the process should now be fixed.

It's not clear just what that fix entails, or just how much of an improvement Facebook has made. The company tends to closely guard the details of its moderation approach, so there's a lot we don't know (we've asked Facebook for comment). And the internet giant has been accused of simultaneously under- and overreacting to content issues, either by leaving material up despite known abuse or by taking down material that isn't at all controversial. While it would be difficult or impossible for Facebook to catch absolutely every violation, it'll have to show it can make incidents like the BBC investigation a thing of the past.

If there's any consolation for Facebook, it's that it isn't the only one on the hot seat. The Home Affairs Committee also grilled Google (specifically, YouTube) and Twitter over their own troubles fighting online hate speech. Both admitted that they had to do more to keep hate off their services. Google wasn't specific about its solutions, but Twitter acknowledged that it had to be more communicative when users file abuse reports. These kinds of issues are relatively common among internet giants, in other words -- it's just that Facebook's latest crisis was more embarrassing than most.