YouTube says it'll finally stop recommending conspiracy videos

We'll believe it when we see it.

YouTube says it will stop recommending conspiracy videos. Given that even the most innocuous of searches can lure you down an algorithmically generated path that almost invariably leads to videos containing outlandish claims, the move seemed inevitable. The YouTube Kids app wasn't immune either: such videos were popping up there too.

Google's video streaming service now says it won't suggest "borderline" videos that come close to violating community guidelines or that "misinform users in a harmful way." Examples of the types of videos it will bury include 9/11 misinformation, flat earth claims and so-called miracle cures for major illnesses. The decision affects less than 1 percent of videos, YouTube says, but given the vast number of clips on the platform, it will still impact millions of them.

"We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users," YouTube said in a blog post. An algorithm will decide which videos won't appear in recommendations, rather than people (though humans will help train the AI). That's perhaps a questionable decision, since algorithms are a root cause of the problem in the first place. The policy will be enforced gradually, starting with a small number of videos in the US before expanding worldwide as the algorithm becomes more refined.

Borderline videos will still appear in search results, however, and you'll still see them in your recommendations if you subscribe to a channel that publishes such content.

A BuzzFeed News report on Thursday found that conspiracy videos (along with misogynist and ultra-partisan clips) frequently appeared in the "Up Next" section -- effectively a never-ending playlist based on what you've previously watched. The report suggested that even when viewers weren't logged in or started with a clean watch history, YouTube's algorithm still recommended such videos.