As if revenge porn and AI-powered facial recognition searches weren't creepy enough, a Motherboard report reveals yet another unsettling use of technology: "deepfakes." Within a month of locating a Redditor who used machine learning to swap the faces of mainstream actresses onto the bodies of porn performers, the outlet has found people using an app based on his techniques to create videos from images of women they know.
The process scrapes social media accounts for a full library of photos and uses web apps to find porn featuring performers whose faces resemble the target, automating what some revenge porn sites had already been doing manually. We've seen similar face-swapping technology in movies for years, but with AI running on desktop GPUs or in the cloud, random people suddenly have access and are using it in unsettling ways (like the Nicolas Cage-on-Amy Adams scene shown here).
Worse yet, Wired spoke to a lawyer who helped write laws against "nonconsensual porn," and she said we may not be able to rely on those legal protections to stop it. While it's possible the app's creator could be held liable for damages, or that people whose faces are used could sue for defamation, there are a number of hurdles involved -- starting with finding out that someone has made one of these videos about you in the first place.