
Mastodon's decentralized social network has a major CSAM problem

The nature of the Fediverse is causing moderation problems.


Mastodon has gained popularity over the past year as Twitter users looked for alternatives following Elon Musk’s takeover. Part of its appeal is its decentralized structure, which insulates it from the whims of billionaires who speak before they think. Unsurprisingly, though, what makes it so appealing has also proven to be a headache, making content moderation all but impossible.

A study from Stanford found 112 matches of known child sexual abuse material (CSAM) over a two-day period, along with almost 2,000 posts using hashtags commonly associated with abusive material. Researcher David Thiel says, “We got more PhotoDNA hits in a two-day period than we’ve probably had in the entire history of our organization of doing any kind of social media analysis, and it’s not even close.” We’ve reached out to Mastodon for comment and will update this story once we’ve heard back.
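For a sense of how that kind of detection works in principle, here is a rough sketch of hash-based matching: known images are reduced to fingerprints, and new media is compared against a database of those fingerprints. Everything in the sketch is a placeholder; the real PhotoDNA system is a proprietary Microsoft service with its own perceptual hashing, available only to vetted organizations.

```python
# Illustrative sketch only. The hash values, names and exact-match logic are
# hypothetical stand-ins, not the actual PhotoDNA service or its API.
import hashlib

KNOWN_HASH_DATABASE = {
    "a3f1c9-placeholder",  # stand-ins for entries in a vetted hash list
    "77b02e-placeholder",
}

def hash_media(media_bytes: bytes) -> str:
    """Stand-in fingerprint; real systems use perceptual hashes, not SHA-256."""
    return hashlib.sha256(media_bytes).hexdigest()

def matches_known_material(media_bytes: bytes) -> bool:
    # Exact lookup against the hash list. Perceptual hashes additionally
    # tolerate small edits (resizing, re-encoding) that SHA-256 would not.
    return hash_media(media_bytes) in KNOWN_HASH_DATABASE
```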

Of course, the big problem with federated social media platforms such as Mastodon is that no single company or entity controls everything on the network. Every instance has its own administrators, and they are the ones who are ultimately responsible. However, those admins cannot control or moderate what happens on other instances or servers.
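To illustrate where that boundary sits, here is a minimal, purely hypothetical sketch of moderation authority stopping at an instance’s own domain. The class and method names are ours, not Mastodon’s.

```python
# Each instance only holds, and can only act on, accounts hosted on its own
# domain. Names here are illustrative, not Mastodon's actual code.
from dataclasses import dataclass, field

@dataclass
class Instance:
    domain: str
    suspended_accounts: set = field(default_factory=set)

    def suspend(self, account: str) -> None:
        # An admin can suspend accounts hosted on their own instance...
        self.suspended_accounts.add(account)

    def can_moderate(self, account_domain: str) -> bool:
        # ...but has no authority over accounts hosted elsewhere.
        return account_domain == self.domain

home = Instance("mastodon.example")
print(home.can_moderate("mastodon.example"))      # True
print(home.can_moderate("other-server.example"))  # False
```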

This isn’t uniquely a Mastodon problem, either. Meta’s popular Threads is also built around the decentralized model. While the integration isn’t live just yet, Threads plans to be interoperable with ActivityPub, the protocol that underpins Mastodon. That means Threads users will be able to follow, reply to and repost content from Mastodon users, and vice versa.
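As a rough, hypothetical illustration of what that interoperability looks like under the hood, this is the shape of an ActivityPub “Follow” activity one server would deliver to another. The account URLs and inbox path are made up, and the request signing a real server requires is omitted.

```python
# Sketch of a cross-server Follow under ActivityPub. Example.* domains are
# placeholders; a real delivery must also be signed (HTTP Signatures).
import json

follow_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": "https://threads.example/users/alice/follows/1",
    "type": "Follow",
    "actor": "https://threads.example/users/alice",   # Threads-side account
    "object": "https://mastodon.example/users/bob",   # Mastodon-side account
}

# The sending server would POST this JSON (as application/activity+json)
# to the target account's inbox, e.g. https://mastodon.example/users/bob/inbox.
print(json.dumps(follow_activity, indent=2))
```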

This creates a unique problem for Meta, which can’t control the entire moderation flow the way it can on Facebook or Instagram. Even on those centralized platforms, the company struggles to keep up with moderation. Presumably, larger instances on Mastodon and other platforms such as Threads could outright block access to problematic instances. Of course, that wouldn’t “solve” the problem. The content would still exist. It would just be siloed and left to the moderators of that specific instance to remove.
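A hypothetical sketch of that kind of domain-level blocking, often called defederation, might look like the check below. The names are illustrative; Mastodon’s real domain-block feature lives in its admin tooling, not in code like this.

```python
# Refuse inbound activities from domains the local admins have blocked.
from urllib.parse import urlparse

BLOCKED_DOMAINS = {"problematic-instance.example"}  # maintained by local admins

def accept_inbound_activity(activity: dict) -> bool:
    actor_domain = urlparse(activity.get("actor", "")).hostname or ""
    if actor_domain in BLOCKED_DOMAINS:
        # Rejected locally, but the content still exists on its home server.
        return False
    return True

print(accept_inbound_activity(
    {"actor": "https://problematic-instance.example/users/x"}))  # False
```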