Study says YouTube 'actively discourages' radicalism
It may be too eager to court the mainstream, however.
Politicians and others complain that YouTube fosters extremism, but how much does it really? Not all that much, according to researchers. Data scientist Mark Ledwich and UC Berkeley researcher Anna Zaitsev have published a study suggesting that YouTube "actively discourages" radicalism through its recommendation system. Their reviewers classified over 760 politics-oriented channels by overall leaning, topics and proximity to the mainstream, and found that YouTube removed "almost all" suggestions for conspiracy theorists, white identitarians and "provocateurs" (read: purposefully offensive creators). For the most part, you're only likely to be matched with questionable content if you're already watching that material.
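To give a sense of the kind of analysis behind that finding, here's a minimal sketch (in Python, using made-up channel names, labels and recommendation data, not the study's actual code or categories) of how recommendation flows between labeled channel groups might be tallied:

```python
from collections import Counter

# Hypothetical reviewer labels for channels (stand-ins for the study's classifications).
channel_category = {
    "channel_a": "mainstream_news",
    "channel_b": "conspiracy",
    "channel_c": "partisan_left",
    "channel_d": "partisan_right",
}

# Hypothetical recommendation pairs: (channel of watched video, channel of recommended video).
recommendations = [
    ("channel_b", "channel_a"),
    ("channel_b", "channel_a"),
    ("channel_d", "channel_a"),
    ("channel_c", "channel_c"),
]

def category_flows(recs, labels):
    """Count how often recommendations point from one channel category to another."""
    flows = Counter()
    for src, dst in recs:
        flows[(labels[src], labels[dst])] += 1
    return flows

def outbound_share(flows, category):
    """Share of a category's outgoing recommendations landing in each destination category."""
    outgoing = {pair: n for pair, n in flows.items() if pair[0] == category}
    total = sum(outgoing.values())
    return {dst: n / total for (_, dst), n in outgoing.items()} if total else {}

flows = category_flows(recommendations, channel_category)
print(outbound_share(flows, "conspiracy"))  # e.g. {'mainstream_news': 1.0}
```

In this toy example, all recommendations from the "conspiracy" channel point back toward mainstream news, which is the sort of pattern the researchers describe.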
If anything, the researchers found, YouTube has the opposite problem -- it's overly safe. Its recommendations clearly favor mainstream channels, including outlets labeled as partisan like Fox News and John Oliver's Last Week Tonight, while independent and smaller channels like The Young Turks tend to suffer regardless of their political leanings. Ledwich also warned that the recommendation algorithm tends to promote "filter bubbles," where people rarely see videos that challenge their views.
The study also indicated that the suggestions tended to favor centrist and left-wing channels. It's not surprising that social justice-oriented videos would lead to those camps, but even some channels devoted to conspiracies and social conservatism pointed users toward the center and left. The researchers didn't accuse YouTube of an anti-conservative bias, but it was clearly harder to get suggestions for right-wing videos unless you were already inclined that way.
Ledwich and Zaitsev argued that this shoots down theories that YouTube spreads radicalization like a virus. Instead, they pointed to a "supply and demand" model from Kevin Munger and Joseph Phillips: if more people are viewing extremist content, it's because more channels are surfacing to meet an existing interest.
This isn't a definitive verdict on radicalization on YouTube, let alone the internet as a whole -- it doesn't really address terrorism, for example. It does, however, challenge assumptions that YouTube's current recommendation model allows extreme ideas to spread in any significant way.