Facebook showed terrorists the profiles of people moderating them

One moderator fled his home country of Ireland and went into hiding.

Have you ever wondered who takes on the grueling, unforgiving task of combing Facebook's groups and personal profiles for terrorist activity? Meet Community Operations workers, who are often paid low wages for highly specialized and difficult work. And now, the job has become even less appealing: It turns out that a bug inadvertently exposed the personal Facebook profiles of the people moderating this violent, graphic content to the terrorists themselves.

The moderators first realized something was wrong when they started receiving friend requests on their personal accounts from the very people and organizations they were investigating. Facebook's security team later determined that a bug had exposed the moderators' Facebook profiles in the activity logs of the groups they were investigating and shutting down.

Facebook's reaction was to put together a "task force of data scientists, community operations and security investigators," according to internal emails obtained by The Guardian. However, the bug remained in place for two weeks after it was discovered, even as Facebook's head of global investigations, Craig d'Souza, was reassuring moderators that it was unlikely the terrorists would connect these personal profiles to moderation activities.

One moderator, though, wasn't willing to take that chance. Unsatisfied with Facebook's offer of a home alarm system and transportation to and from work, he fled Ireland, where he'd moved as a child after seeking asylum from Iraq. "The punishment from Isis for working in counter-terrorism is beheading," the unnamed worker told The Guardian. "All they'd need to do is tell someone who is radical here." He has since returned to Ireland but is now suing Facebook for psychological damage.

News of this breach comes on the heels of Facebook's renewed commitment to counterterrorism. The company recently reported on its efforts to thwart terrorism on the social network and its increasing use of AI to identify threats. The report also mentioned plans to hire 3,000 more Community Operations workers -- workers like this unnamed Irish moderator, paid the equivalent of $15 an hour to become experts at analyzing and identifying suspected terrorist activity. Moderating social activity on a network with over 2 billion users is no easy task, but protecting the privacy of those who do this unforgiving work should be at the top of Facebook's priority list.