
Reddit bans the 'deepfake' AI porn it helped spawn

It hasn't banned every community, though.


That didn't take long. Hot on the heels of Twitter, Reddit has updated its rules to expressly ban AI-generated "deepfake" porn. Where it previously had a single rule forbidding porn and suggestive material involving minors, it now has two -- and it's clear that you're not allowed to post "depictions that have been faked." You also can't post regular imagery of others with the goal of producing fake nonconsensual porn.

Accordingly, Reddit has cracked down on some of the offending communities. It has shut down the deepfakes subreddit that got the ball rolling, as well as YouTubefakes. It hasn't closed non-deepfake subreddits like CelebFakes, however, and it's also leaving intact communities with more innocuous purposes, such as FakeApp (for the program itself) and SFWdeepfakes. At the moment, this is more about addressing the specific violations that triggered the uproar than about stamping out every potential violation of the policy.

In a statement to Engadget, Reddit reiterated the rule split and stressed that it hoped to create a "welcoming, open platform for all" that trusted its users to foster a space that "cultivates genuine conversation." You can read the full statement below.

This certainly won't eliminate all AI-produced porn. However, the deepfakes subreddit is considered the effective birthplace of the trend, and its disappearance might cut back on the volume of fake porn elsewhere. Discord, Gfycat, Pornhub and others have already vowed to ban the material. The challenge, as always, is enforcement. There's no guarantee that sites can or will remove everything. Creepy face swaps may persist for a long time, even if they aren't as widespread as they have been in recent days.