Reddit has had it up to here with the trolls in its forums and is finally doing something about it. In fact, Reddit's doing a lot of somethings. In a post to the company's official blog this morning, Reddit admins explained that while the vast majority of the site's 9,000 or so boards are generally civil, too many users are being dissuaded from participating on account of the abuse and harassment they receive from the site's trolls. A recent survey of 15,000 Reddit users shows that "the number one reason Redditors do not recommend the site—even though they use it themselves—is because they want to avoid exposing friends to hate and offensive content." That's a major indictment and a big problem for Reddit as a company (not to mention its bottom line).
As such, Reddit has taken a number of steps over the past few months to become more transparent and responsive to its users' concerns. The company released its first Annual Transparency Report in January, banned revenge porn in March and, just yesterday, provided a fuller explanation of what content admins remove for legal reasons. Today, the company defined harassment as:
Systematic and/or continued actions to torment or demean someone in a way that would make a reasonable person (1) conclude that reddit is not a safe platform to express their ideas or participate in the conversation, or (2) fear for their safety or the safety of those around them.
Reddit is stressing that this policy change "is specifically designed to prevent attacks against people, not ideas." Users who feel they are being harassed -- in-thread, via a private message, or even through an external site that links back to Reddit -- are urged to contact the mods at email@example.com. Of course, reporting the harassment should in no way dissuade users from responding to their attackers with righteous logic, as Balpreet Kaur did recently.
[Image Credit: Shutterstock]