YouTube’s existing policies are not enough to discourage creators from posting “problematic” content. That’s one of the findings of new research from Cornell Tech on how YouTubers make money.
YouTube has long used the threat of demonetization to encourage creators to follow its rules. Creators who violate its policies or who veer into so-called borderline content — videos that don’t outright break the rules but come close enough that the company stops recommending them — are at risk of losing access to monetization features.
But researchers at Cornell and the Swiss Federal Institute of Technology Lausanne found that demonetization may not always have the intended effect. That’s because it’s still exceedingly easy for creators who have been demonetized to direct viewers to other money-making platforms like Patreon.
Moreover, they found that YouTubers who traffic in extreme and “problematic” content are significantly more likely to employ “alternative monetization” sources than their peers. According to their findings, 61 percent of “fringe channels” used an alternative monetization source, compared with just 18 percent of channels overall.
At the same time, the researchers found that demonetizing a channel tends to result in creators producing more content — not less. And demonetization may even result in more divisive and extreme content, because creators are now trying to appeal to “committed audiences” rather than the general YouTube viewer.
“On the one hand, weakening the link between exposure and earnings may allow higher-quality content to be produced,” they write in the paper. “On the other, it may also encourage creators to embrace divisive rhetoric … Even if videos are demonetized by YouTube for breaching their policy, it could be that, due to alternative monetization strategies, creators still have substantial financial incentives to create content espousing false, hateful, and divisive narratives.”
The researchers say that platforms like Patreon, as well as lesser-known sites like SubscribeStar, need more scrutiny, as they are becoming more popular across YouTube, not just with “problematic” streamers.
How to handle borderline content, and how far YouTube should go in discouraging it, isn’t a new issue. Last month, YouTube’s Chief Product Officer Neal Mohan said the company was wrestling with whether to take more aggressive steps to prevent problematic YouTube content from going viral on other platforms. One idea under consideration, he said, would be to “break” sharing on these videos so they can’t spread as easily.