"It absolutely is a major risk factor for things like depression, post-traumatic stress disorder and anxiety," said Lucy Bowes, an associate professor of experimental psychology at the University of Oxford who specializes in cyberbullying.
"It absolutely is a major risk factor for things like depression, post-traumatic stress disorder and anxiety."
"I suspect the people who are vulnerable drop out quite early because it hurts so much," said Bowes. "Clearly it's not OK for anyone, but I imagine it's quite self-selecting."
She warns that the longer a moderator works, the greater the risk of developing psychological problems. Many people will be resilient enough to cope with the abuse for a time, but others will crack once it passes a certain threshold. Bowes said that moderators with pre-existing mental health issues are especially at risk.
"This will build up and build up, and you'll start seeing things like missing work, difficulty sleeping, weight gain or weight loss. You'd start to see those precursors to poor mental health," she said.
Many of the moderators interviewed said they worried about their fellow moderators burning out under the pressure caused by their online harassers, but they all claimed not to be affected themselves.
Bowes said Reddit has an obligation to tell new moderators, before they begin, about the kind of abuse they can expect to receive and to offer advice on how to handle it. The company also needs to clearly explain how it will respond when moderators report that abuse, she said.
"Reddit has a responsibility in making sure that people working for them know what they need to do. So taking screenshots, for example, and recording so they have as much information as possible should it go to prosecution."
That's something that Emily also called for. "It would be good to have a country-specific resource, because the laws on cyberbullying in the UK are different to [the US], which would be different to other countries."
"Reddit has a responsibility in making sure that people working for them know what they need to do."
William agreed. "It's so informal. We could do with a more robust system."
Some moderators have attended Mod Roadshow events, at which Reddit administrators travel across the US and UK to meet moderators. The events were intended to find ways to make moderation easier, but the moderators' response was lukewarm. Some said the meetings simply highlighted the gulf between moderators and administrators, and that they felt patronized when administrators explained the basics of moderation.
Moderators also complained of feeling ignored when they report abusive messages. They said they almost never hear back from administrators and aren't told whether the offending account was removed from the site.
"Reddit has gone far out of its way to be as little involved with moderation as possible," said Velo. "Most people don't even try to send [abusive] users to the admins anymore, because it seems like they never respond and if they do it's so vague and unsatisfactory."
Additionally, Bowes said Reddit should check in regularly to make sure moderators are OK, and should offer them resilience training to help them cope.
"Reddit has gone far out of its way to be as little involved with moderation as possible."
"This is a large company. It should have the resources to be able to do this," said Bowes.
Some moderators would like to see Reddit act on Bowes' suggestions.
"I would absolutely welcome that," said Velo.
"If you're a moderator on Reddit for certain subreddits that are large enough, you should get moral support -- actual psychological support to talk to a specialist after you get absolutely abused," said Allam.
However, other moderators expressed a degree of sympathy for Reddit: The site is enormous, they admitted, and it would be expensive to offer counseling services to moderators.
But Reddit's size may not excuse it from tackling the issue of abuse in general. According to a research project at Stanford University, just one percent of Redditors are responsible for 74 percent of all conflicts on the site. It's a relatively small but troublingly devoted group of users who send the endless violent death and rape threats.
When asked for comment, Reddit said it is "constantly reviewing and evolving" its policies, enforcement tools and support resources. It also noted that it has doubled its admin team in the past two years to ensure it has "the teams in place to make the improvements needed," and is building solutions to detect and address bad behavior to ensure "it doesn't become a burden on moderators."
Update 1:30PM ET:
Reddit has provided an expanded comment on this story. It also pointed to its r/ModSupport community, Mod Help Center and updated report flow for making Reddit aware of infractions such as harassment. Its full, updated statement follows:
"We look into all specific allegations that are reported to us. Harassment and persistent abuse toward moderators are not acceptable behaviors on the site. We are constantly reviewing and evolving our site-wide policies, enforcement tools, and community support resources. Reddit has also doubled in size in the past two years in order to ensure we have the teams in place to make the improvements needed. This includes expanding our Trust and Safety team and our Community team, as well as building engineering solutions for detecting and addressing bad behavior so it doesn't become a burden on moderators. We are also particularly focused on improving the tools available for moderators, and we are grateful for the feedback we have received from them both online and through our efforts to meet them in person. Since 2017, we've hosted 13 events around the United States and in London to give the moderators an opportunity to share their views with us in person, which we always appreciate. We know there is more to do, and we will continue to evolve the human and technological resources available to ensure that Reddit is a welcoming place."
Credits:
Reporter: Benjamin Plackett
Editors: Aaron Souppouris, Jay McGregor, Megan Giller
Images: Reddit/Point
Video by Point
Narrator: Jay McGregor
Producers: Jay McGregor, Aaron Souppouris
Editor: Anton Novoselov