Facebook is under new scrutiny for its moderation practices in Europe
Facebook doesn't do enough to protect content reviewers, according to a current moderator.
Facebook is once again facing questions about its treatment of content moderators after a moderator told an Irish parliamentary committee that the company doesn’t do enough to protect the workers who sift through violent and disturbing content on the platform.
Isabella Plunkett, who currently works for Covalen, an Irish outsourcing company that hires content moderators to work as contract staff, told the committee that non-employee moderators aren’t given adequate access to mental health resources. For example, Covalen allows for an hour and a half of “wellness time” each week, but the company-provided “wellness coaches” are not mental health professionals, and are not equipped to help moderators process the traumatic content they often deal with. Plunkett told the committee that these wellness coaches sometimes suggested activities like painting or karaoke.
“The content is awful, it would affect anyone,” she said at a press conference following the hearing. “No one can be okay watching graphic violence seven to eight hours a day.” She said moderators should be afforded the same benefits and protections as actual Facebook employees, including paid sick time and the ability to work from home. Plunkett also raised the issue of Facebook’s reliance on non-disclosure agreements, which she said contributed to a “climate of fear” that makes moderators afraid to speak out or seek outside help.
In a statement, a Facebook spokesperson said the company is “committed to working with our partners to provide support” to people reviewing content. “Everyone who reviews content for Facebook goes through an in-depth training programme on our Community Standards and has access to psychological support to ensure their wellbeing,” the spokesperson said. “In Ireland, this includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to potentially graphic material as much as possible. This is an important issue, and we are committed to getting this right.”
This is far from the first time these issues have been raised. The workplace conditions of content moderators, who spend their days wading through the worst content on the platform, have long been an issue for Facebook, which depends on non-employee moderators around the world. The company last year agreed to a $52 million settlement with U.S.-based moderators who said their jobs resulted in PTSD and other mental health issues.
As part of the settlement, Facebook agreed to make several changes to the way it handles content that’s funneled to moderators for review. It introduced new tools that allow moderators to view videos in black and white and with audio muted, in an effort to make the often violent and graphic content less disturbing to watch. It also added features to make it easier to skip to the relevant parts of longer videos, reducing the overall time spent watching the content. The company has also made significant investments in AI technology, with the hope of one day automating more of its moderation work.
But Facebook may soon have to answer questions about whether these measures go far enough to protect content moderators. The committee is expected to ask representatives from Facebook and its contracting companies to appear at another hearing to face questions about their treatment of workers.