
Facebook’s cluelessness helped amplify Myanmar tensions

A Facebook-commissioned report says the company made it too easy for bad actors to spread hate in Myanmar.

As the United States goes to the polls for the 2018 midterms, Facebook's influence is once again under scrutiny. The social network has published a report it commissioned in August from Business for Social Responsibility (BSR), a San Francisco–based nonprofit tasked with investigating Facebook's role in enabling political violence in Myanmar. Essentially, the report finds that while Facebook wasn't the root cause of the violence, it amplified calls for it.

The Republic of the Union of Myanmar, formerly Burma, is a country in Southeast Asia bordered by India, Bangladesh, Thailand, Laos and China. It is rich in mineral wealth but remains one of the poorer nations in the region, and its political situation is fragile. Its first democratic election in a generation took place in 2015, but its de facto leader, Aung San Suu Kyi, is constitutionally barred from the presidency.

Myanmar is a majority Buddhist country with smaller Christian and Muslim populations (6 and 5 percent, respectively). Its inhabitants are ethnically diverse, and tensions between these groups remain high. The UN Human Rights Council has said that the country's military, which retains a formal political role, has played on those tensions to stoke violence and justify its hold on power.

It's this history that Facebook failed to appreciate when it entered the country following the reduction of internet restrictions in 2010. By 2016 it was using its Free Basics data service to help get vast numbers of the population online. And now, "for the majority of Myanmar's 20 million internet-connected citizens, Facebook is the internet."

The report adds that Myanmar's users have a low level of "digital literacy" and have struggled to distinguish truth from lies. This has allowed the platform to be used to spread rumors that led to "death threats and public disclosure of private information." Facebook is also used as a platform for extortion, with individuals threatening to post Photoshopped images of women to porn sites unless they are paid.

Myanmar's digital illiteracy manifests in other notable ways, such as a reluctance to use the Share button. Instead, users prefer to "copy and paste the content," making it harder for moderators to trace the origins of fake stories. This, coupled with Facebook's ubiquity, has made things easier for "those seeking to spread hate and cause harm" -- with a corresponding rise in real-world violence. While the report doesn't lay all of the blame at Facebook's feet, it says the service has been used "by bad actors to spread anti-Muslim, anti-Rohingya and anti-activist sentiment."

One such event was the forwarding of a chain letter warning Buddhists to prepare for an attack by Muslim groups. A mirror letter was sent to Muslim groups, warning them that a violent attack by militant Buddhist groups was forthcoming. Facebook's failure to deal with the letters led to a personal apology from CEO Mark Zuckerberg, who said the company would hire more Burmese-speaking moderators.

Myanmar's military has used these interfaith tensions to justify human rights violations against both Rakhine Buddhists and Rohingya Muslims. Violations against the former include forced labor, land seizure and rape, while the latter have been rendered essentially stateless. That has led to Rohingya being tortured, their villages being burned down and a campaign of ethnic cleansing. More than 700,000 Rohingya have since fled to Bangladesh for asylum.

Facebook first acted in 2017, before the publication of the controversial chain letters, by censoring posts from the Arakan Rohingya Salvation Army (ARSA), as well as posts supporting it. The group was described as an insurgency against the government, and government officials praised Facebook for the move. In response, activist Mohammad Anwar said that Facebook was "colluding with the genocidaires." Then, in August 2018, the company followed guidance from the UN Human Rights Council and banned accounts held by prominent military leaders, whom it described as either committing or enabling "serious human rights abuses."

The report highlights the fine line Facebook has to walk in dealing with abuses like these in the future. One solution would be to hire local moderators based in the country, but that could expose them to reprisals from local leaders and open users up to surveillance and data seizure by hostile governments. The decision to ban prominent military leaders will likely have repercussions of its own, potentially including further crackdowns on internet access.

The report concludes by recommending that Facebook create a human rights policy and make people accountable for upholding it. It also says the company needs to rethink its content moderation policies so that it's harder for violent material to remain online, and to invest in people and tools that can take down content inciting violence and stop it from spreading.

The report also says that Facebook should help the population become better informed online and improve fact-checking initiatives. There's no existing fact-checking structure in the country that can be leveraged, so Facebook may have to set one up itself. And, of course, Facebook should harden its platform to prevent fake news, weaponized information and violence from derailing the country's 2020 election.

Facebook issued a response, written by Product Policy Manager Alex Warofka, which was published in tandem with the report. He said that the company had failed to prevent its platform from being used to incite offline violence and "foment division," adding that "we can, and should, do more." As part of the apology, he outlined a plan to invest in people, technology and partnerships to improve the company's moderation and tools.

Warofka also pledged that Facebook would bring the number of moderators who speak Myanmar's languages to "at least 100 by the end of 2018" -- 99 of whom, he added, have already been hired. He said these moderators are "improving the development and enforcement of our policies." Warofka also noted that "Facebook alone cannot bring about the broad changes needed to address the human rights situation in Myanmar." This is true, but the company must take responsibility for what its platform has already done and help, to the extent possible, to fix things.

Facebook didn't cause Myanmar's structural problems, but it has helped to exacerbate them. The company didn't appreciate the population's digital illiteracy and failed to account for the country's political tensions. In pursuing growth for growth's sake, Facebook created a landscape in which fake news and violence could flourish. Its failure to appreciate the consequences of its actions has had real effects -- a lesson it apparently keeps needing to learn.