The site will also start looking into harassment reports from observers, not just victims. An "improved" machine learning system will better prioritize and organize those reports, but humans will still make the final decisions.
It'll take some time to "fully enforce" the new measures. However, the crackdown has already affected a number of subreddits, including the r/Braincels community aimed at incels ("involuntary celibates"). Reddit also acknowledged that subreddits explicitly dedicated to sexism, racism or other forms of bigotry or harassment would be "likely to break the rules."
Reddit framed its stance as a response to feedback from users who felt the site wasn't addressing abuse properly or swiftly. There were many instances where the company didn't take action against harassment, even in the more "egregious" cases. The update, Reddit said, should ensure the rules reflect the "spirit" of what was intended.
The updated policy is already raising concerns. Its definitions are vague enough that Reddit could clamp down on users and communities through overly subjective interpretations of harassment and group targeting. Does a subreddit face a ban because some of its members run coordinated harassment campaigns and issue threats, even if the community wasn't created for that purpose? And what happens to support communities for people leaving religions?
Reddit may feel like it doesn't have much choice. Harassment has remained a problem on the site despite past bans. There's also the simple matter of sustaining the business: the company has previously had trouble courting advertisers due to toxic subreddits, and big brands might be happier knowing their ads won't appear alongside particularly vile discussions.