Mozilla project exposes YouTube's recommendation 'bubbles'

See how YouTube’s recommendation algorithm could lead to a ‘thought bubble.’

Tomo Kihara

We’ve all seen social media posts from our climate change-denying cousin or ultra-liberal college friend and wondered how they came to certain conclusions. Mozilla’s new project, “TheirTube,” created by Amsterdam-based designer Tomo Kihara, offers a glimpse at theoretical YouTube homepages for users in six different categories: fruitarian, doomsday prepper, liberal, conservative, conspiracist and climate denier. Through these personas, Mozilla hopes to demonstrate how YouTube’s recommendation algorithm can confirm certain biases.

The six personas were created after Kihara conducted interviews with real YouTube users who had experienced similar recommendation bubbles. An account was created for each persona and subscribed to the channels that the interviewees followed. The six example homepages show what YouTube’s recommendation algorithm suggests to each persona every day. Thus, the fruitarian account sees videos like “What’s really inside your vitamins and supplements” and “No more mucus! Free yourself!” while the conspiracist is recommended videos like “COVID 911 - INSURGENCY - WAKEUP CALL” and “Loch Ness Monster sightings.”

Mozilla sums up the types of videos YouTube’s recommendation algorithm is likely to suggest for each persona. The fruitarian will see videos showing “how to have a hardcore organic life.” Videos for the prepper will “explore apocalyptic scenarios and how to ‘prepare’ for them.” The liberal is recommended videos that “tend to support notions like feminism and multiculturalism,” while the conservative will see videos criticizing those ideologies. YouTube will suggest to the conspiracist videos that “suggest global events are in fact conspiracies.” Finally, the climate denier will see videos that attempt to “‘debunk’ scientific evidence about global warming.”

The suggested videos aren’t necessarily fake or misleading. But Mozilla notes that the YouTube algorithm is designed to amplify content that will keep us clicking, even if that content is “radical or flat out wrong.”

“This project raises questions like: What if platforms were more open about the recommendation bubbles they created? And: By experiencing other users’ bubbles, is it possible to gain a better perspective of your own recommendation environment?” said Tomo Kihara, creator of TheirTube.

The YouTube recommendation algorithm accounts for 70 percent of videos watched on the site, according to Mozilla. The algorithm has long been criticized for recommending conspiracy videos rife with misinformation, as well as videos featuring minors; the latter prompted Senator Josh Hawley to propose legislation requiring YouTube to fix the problem. YouTube responded by pledging last year not to recommend “borderline” videos that come close to violating community guidelines or that “misinform users in a harmful way.” However, as TheirTube aims to demonstrate, YouTube bubbles still occur.