YouTube leaders ignored proposals to alter recommendations to stamp out toxic videos and to tackle conspiracy theories, several former and current employees told Bloomberg. Executives were more concerned with keeping viewers engaged, according to the report.
"Scores" of YouTube and Google employees raised concerns about the "mass of false, incendiary and toxic content" over the last few years. Many suggested changes to YouTube executives or tracked the prevalence and popularity of toxic videos to show senior management the extent of the problem. However, the word reportedly came from on high to stay the course in hopes of avoiding a dip in engagement metrics.
YouTube employees have wrestled with executives' approach to the platform's problems. Five senior staff members who left the company over the last couple of years claimed YouTube's long-standing "inability to tame extreme, disturbing videos" was why they departed.
Still, outlandish content can draw attention, potentially leading to significant advertising revenue. Parent company Alphabet typically doesn't disclose YouTube revenue numbers in earnings reports, though the video-sharing platform is estimated to pull in north of $16 billion per year.
A spokeswoman disputed some of the report's claims, including that CEO Susan Wojcicki "is inattentive to these issues and that the company prioritizes engagement above all else," according to Bloomberg.
The service has been focused on finding solutions for some of its "toughest content challenges" over the last two years, a spokesperson told Engadget in a statement. Those measures include "updating our recommendations system to prevent the spread of harmful misinformation, improving the news experience on YouTube, bringing the number of people focused on content issues across Google to 10,000, investing in machine learning to be able to more quickly find and remove violative content, and reviewing and updating our policies."
Yet several years before YouTube pledged to stop recommending conspiracy videos, a privacy engineer suggested that videos skirting the edges of the site's content policies shouldn't be included in recommendations. His proposal was rejected at the time, according to the report, though YouTube eventually adopted the idea this January. YouTube has added other measures to combat false content, including information panels on video pages and search results that offer truthful information on sensitive or controversial topics.
Meanwhile, Motherboard reports white nationalist and neo-Nazi propaganda videos are freely available on YouTube. The publication shared some examples with the service, which removed advertising from them, added content warnings and made sure they didn't appear in recommendations. But the videos are still on YouTube, and you can view them via search results.