If anything, the researchers found, YouTube has the opposite problem -- it's overly safe. The recommendations clearly favor mainstream channels, including outlets labeled as partisan like Fox News and John Oliver's Last Week Tonight. Independent and smaller outlets like The Young Turks tend to suffer regardless of their political leanings. Ledwich also warned that the recommendation algorithm tends to promote "filter bubbles" where people rarely see videos that challenge their views.
The study also indicated that the suggestions tended to favor centrist and left-wing channels. It's not surprising that social justice-oriented videos would lead to those camps, but even a portion of the channels devoted to conspiracies and social conservatism pointed users toward the center and left. The researchers didn't accuse YouTube of an anti-conservative bias, but it was definitely harder to see suggestions for right-wing videos if you weren't already inclined that way.
Ledwich and Zaitsev argued that this shoots down theories that YouTube spreads radicalization like a virus. Instead, they floated a "supply and demand" model from Kevin Munger and Joseph Philips -- if more people are viewing extremist content, that's because more channels are surfacing to meet existing interest.
This isn't a definitive conclusion on radicalization for YouTube, let alone the internet as a whole. It doesn't really address terrorism, for example. However, it does challenge assumptions that YouTube's current recommendation model allows extreme ideas to spread in a significant way.