The activists, who represent six civil society organizations, condemned Facebook for the way it handled a chain letter that spread across the social network in September. The content warned Buddhists of an imminent attack from Muslims, while Muslims were told to prepare for violence from militant Buddhist groups. The civil society organizations said the messages caused widespread panic, and that such scaremongering was becoming increasingly prevalent on the platform.
In an interview last week, Zuckerberg used this incident as an example of Facebook's effectiveness in tackling hate speech, claiming that its systems had detected and removed these messages. The activists, however, said they were forced to flag the content repeatedly, and Facebook only stepped in to help after its employees were bombarded with appeals from activists and residents in the country. Their letter to Zuckerberg condemned "an overreliance on third parties, a lack of a proper mechanism for emergency escalation, a reticence to engage local stakeholders around systemic solutions and a lack of transparency."
In his personal reply, Zuckerberg apologized for "not being sufficiently clear about the important role" these organizations play. He said his "intention was to highlight how [Facebook] is building artificial intelligence to help better identify abusive, hateful or false content" before it's flagged by users, and he added that Facebook has "added dozens more Burmese language reviewers" and has increased the number of people across the company on "Myanmar-related issues".
However, his response has not reassured activists, who say Facebook has a history of pledging to do more to help quell violence in the country but has not made good on its promises. Speaking to the New York Times, Jes Peterson, chief executive of Myanmar-based innovation lab Phandeeyar, said: "It's great that he's engaging personally with this, but the stuff he's talking about is really not that much different from what they've been saying for the past few years." He added: "Dozens of content reviewers is not going to cut it."
Activists in other developing nations have raised similar concerns about Facebook's behaviour, with politicians and civil society organizations in Indonesia, the Philippines and Sri Lanka calling for greater measures against the spread of misinformation. But it's clear Facebook recognizes its duty to do more for its users. In an interview last month, Adam Mosseri, head of Facebook's News Feed, admitted that he and other executives "lose some sleep" over the thought that Facebook incites real-world violence.