YouTube's channel removals soar following hate speech crackdown

Comment removals nearly doubled after the June policy change.


Jon Fingas
September 3, 2019 11:05 AM
Olly Curtis/Future via Getty Images

YouTube's tougher stance on hate speech has led it to remove much, much more content than it has in the past. The Google-owned video service has revealed that it removed over 17,000 channels and 100,000 videos for hate speech, both fivefold increases over its previous activity. It "nearly doubled" the volume of comment removals, too, to more than 500 million. The jumps were partly due to the removal of material that had previously been allowed, but they still suggest that YouTube's enforcement is proving more effective.

Content overseers are apparently moving quickly, too. The nearly 30,000 videos pulled for hate speech in the past month generated only three percent of the views that knitting videos did over the same period.

The service also used the news as an opportunity to boast about its automated video flagging. Machine learning was the first to catch 87 percent of the 9 million videos removed in the second quarter of 2019, and more than 80 percent of automatically flagged videos were removed before anyone saw them. Improved spam detection saw a 50 percent jump in the number of channels pulled for violating relevant policies.

The company did acknowledge the trickiness of applying AI to hate speech, though. It noted that categories like that are "highly dependent on context" and need human review to make appropriate decisions. YouTube's crackdown on hate speech inadvertently pulled history videos and channels that were merely educating people on Nazis and World War II, for example -- clearly, the technology wasn't clever enough to make the distinction.


The achievements won't please everyone. In addition to those videos that might slip through the cracks, there have been concerns that YouTube might be stifling free expression with the aggressive takedowns. Critics have asserted that allowing hateful videos puts the perpetrators out in the open where they can be challenged and, ideally, steered away from their extreme views. However, YouTube and others don't see it that way -- they're worried about the potential for radicalization, not to mention the possibility these videos will show up in recommendations attached to more innocuous clips. YouTube doesn't want to take any chances, and that means pulling content en masse.
