YouTube has long been criticized for its opaque recommendation algorithm. Some researchers have warned that the video app’s algorithmic suggestions too often gravitate toward conspiracy theories, which can be especially dangerous to younger users.
The company has promised improvements, but critics — like Mozilla — say they don’t have any way to evaluate the company’s claims of progress. Mozilla is hoping its latest project might change that: The browser maker introduced a new extension that allows YouTube users to “donate” their recommendations in the name of helping researchers and others better understand the video platform’s algorithm.
The browser extension, called RegretsReporter, is available for both Chrome and Firefox, and allows YouTube viewers to anonymously make their own video suggestions available to outside experts to study.
“As you browse YouTube, the extension will automatically send data about how much time you spend on the platform, without collecting any information about what you are watching or searching,” Mozilla’s VP of Advocacy Ashley Boyd writes in a blog post. Users will be encouraged to report troubling recommendations they see, as well as describe the type of content that led up to them. Boyd notes that the data the extension collects will be anonymized and shared with outside experts “in a way that minimizes the risk of users being identified.”
The ultimate goal, according to Boyd, is to better understand what type of content leads YouTube to suggest videos that are violent, racist or conspiratorial, and to identify any patterns that may trigger these recommendations. Mozilla will then share its findings publicly so that anyone can study them.
What’s less clear is just how useful this data will be, and how much interest there is in examining self-reported recommendations. To gain truly useful data, Mozilla will need a wide swath of users to opt in to sharing their YouTube viewing history, which may be unappealing despite the company’s promise to protect users’ privacy. There also doesn’t seem to be a good way for Mozilla to protect against users trying to game recommendations or purposely get sucked into a YouTube “rabbit hole.” (The company notes that people should continue to use YouTube as they normally would while using the extension.)
That said, if it is able to help researchers or journalists identify even a few patterns, that could go a long way toward helping YouTube make good on its promises to fix its recommendations.
The extension is far from the first time Mozilla has used its platform to call out YouTube and its algorithm. A previous Mozilla project highlighted the video app’s supposed “recommendation bubbles.” The company has also met with Google employees and published its own recommendations on changes YouTube should make.