Discord bans teen dating servers and the sharing of AI-generated CSAM

The chat service has updated its policy to protect underage users.

Discord has updated its policy meant to protect children and teens on its platform after reports emerged that predators have been using the app to create and spread child sexual abuse material (CSAM), as well as to groom young teens. The platform now explicitly prohibits AI-generated photorealistic CSAM. As The Washington Post recently reported, the rise of generative AI has led to an explosion of lifelike images sexually depicting children. The publication had seen conversations about using Midjourney, a text-to-image generative AI that operates through Discord, to create inappropriate images of children.

In addition to banning AI-generated CSAM, Discord now also explicitly prohibits any other kind of text or media content that sexualizes children. The platform has banned teen dating servers as well and has vowed to take action against users engaging in this behavior. An earlier NBC News investigation found Discord servers advertised as teen dating servers in which participants solicited nude images from minors.

Adult users have previously been prosecuted for grooming children on Discord, and crime rings have even extorted underage users into sending sexual images of themselves. Banning teen dating servers completely could help mitigate the issue. Discord has also added a line to its policy stating that older teens found to be grooming younger teens will be "reviewed and actioned under [its] Inappropriate Sexual Conduct with Children and Grooming Policy."

Aside from updating its rules, Discord recently launched a Family Center tool that parents can use to keep an eye on their kids' activity on the chat service. While parents won't be able to see the actual contents of their kids' messages, the opt-in tool lets them see who their children are friends with and who they talk to on the platform. Discord hopes these new measures and tools, along with existing ones such as proactively scanning images uploaded to its platform using PhotoDNA, will help keep its underage users safe.