
Midjourney ends free trials of its AI image generator due to 'extraordinary' abuse

The tool had been used to fake images of Trump and the Pope, among others.

Midjourney is putting an end to free use of its AI image generator after people created high-profile deepfakes using the tool. CEO David Holz said on Discord that the company is ending free trials due to "extraordinary demand and trial abuse." New safeguards haven't been "sufficient" to prevent misuse during trial periods, Holz said. For now, you'll have to pay at least $10 per month to use the technology.

As The Washington Post explains, Midjourney has found itself at the heart of unwanted attention in recent weeks. Users relied on the company's AI to build deepfakes of Donald Trump being arrested, and Pope Francis wearing a trendy coat. While the pictures were quickly identified as bogus, there's a concern bad actors might use Midjourney, OpenAI's DALL-E and similar generators to spread misinformation.

Midjourney has acknowledged trouble establishing content policies. In 2022, Holz justified a ban on images of Chinese leader Xi Jinping by telling Discord users that his team only wanted to "minimize drama," and that preserving any access in China was more important than allowing satirical content. In a Wednesday chat with users, Holz said he was having difficulty setting content policies as the AI enabled ever more realistic imagery. Midjourney hopes to improve the AI moderation that screens for abuse, the founder added.

Some developers have resorted to strict rules to prevent incidents. OpenAI, for instance, bars any images of ongoing political events, conspiracy theories and politicians. It also forbids hate, sexuality and violence. However, others have relatively loose guidelines. Stability AI won't let Stable Diffusion users copy styles or make not-safe-for-work pictures, but it generally doesn't dictate what people can make.

Misleading content isn't the only problem for AI image generation. There are longstanding concerns that the pictures amount to theft, as generators frequently draw on existing images as reference points. While some companies are embracing AI art in their products, plenty of firms are hesitant, worried they'll attract unwanted attention.