Discord (still) has a child safety issue

A report identified 35 cases that led to physical offenses following communication on the site.

A new report has revealed alarming statistics about Discord's issues with child safety. Over the past six years, NBC News identified 35 cases of adults being prosecuted on charges of "kidnapping, grooming or sexual assault" that allegedly involved Discord communication. At least 15 of those have resulted in guilty pleas or verdicts, with "many others" still pending.

Reporters also discovered 165 more cases, including four crime rings, in which adults were prosecuted for sharing CSAM (child sexual abuse material) via Discord or allegedly using the site to extort children into sending sexually graphic images of themselves, a practice known as sextortion. The illegal acts often take place in hidden communities and chat rooms, according to the report.

A simple Google search of "site:justice.gov Discord" yields a large number of hits, many of a disturbing nature. In one case identified by NBC News, "a teen was taken across state lines, raped and found locked in a backyard shed, according to police, after she was groomed on Discord for months."

"What we see is only the tip of the iceberg," the Canadian Center for Child Protection's Stephen Sauer told NBC News. And it's not the first time Discord has been under fire for its handling of child abuse complaints. Last year, CNN also identified numerous incidents of CSAM, with some parents claiming that Discord offered little help.

Earlier this year, the National Center on Sexual Exploitation (NCOSE) issued a statement titled "Discord Dishonestly Responds to How it Handles Child Sexual Abuse Material After Being Named to 2023 Dirty Dozen List." Among other things, it noted that CSAM links that had been identified and reported were still available on Discord's servers "over two weeks later." It added that Discord's response to the issues has been "far too passive, failing to proactively search for and remove exploitation." It recommended that the site, which currently has over 150 million users, ban minors "until it is radically transformed."

In a recent transparency report, Discord said its "investment and prioritization in child safety has never been more robust," adding that it disabled 37,102 accounts and removed 17,425 servers for child safety violations. The company's VP of trust and safety, John Redgrave, told NBC News that he believes the platform's approach to the issue has improved since Discord bought the AI moderation company Sentropy in 2021. The platform uses several systems to proactively detect CSAM and analyze user behavior, and Redgrave said he thinks the company now proactively detects most material that has already been "identified, verified and indexed."

However, the systems can't currently detect child sexual abuse materials or messages that have yet to be indexed. In a review of Discord servers created over the past month, NBC News found 242 that use thinly disguised terms to market CSAM.

Discord isn't the only social media company with CSAM problems. A recent report found that Instagram helped "connect and promote a vast network of accounts" devoted to underage-sex content. However, Discord has reportedly made things particularly difficult for law enforcement at times, in one case asking for payment after the Ontario Police requested that it preserve records, according to the report. Engadget has reached out to Discord for comment.