Image credit: Olly Curtis/Future via Getty Images

YouTube's channel removals soar following hate speech crackdown

Comment removals nearly doubled after the June policy change.

YouTube's tougher stance on hate speech has led it to cull much, much more content than it has in the past. The Google-owned video service has revealed that it removed over 17,000 channels and more than 100,000 videos for hate speech, both fivefold increases over its previous activity. It "nearly doubled" the volume of comment removals, too, to more than 500 million. The jumps were partly due to the removal of material that had previously been allowed, but they still suggest that YouTube's enforcement is proving more effective.

Content overseers are apparently moving quickly, too. The nearly 30,000 videos pulled for hate speech in the past month produced only three percent of the views that knitting videos did.

The service also used the news as an opportunity to boast about its automated video flagging. Machine learning was the first to catch 87 percent of the 9 million videos removed in the second quarter of 2019, and more than 80 percent of automatically flagged videos were taken down before anyone saw them. Improved spam detection also produced a 50 percent jump in the number of channels pulled for violating the relevant policies.

The company did acknowledge the trickiness of applying AI to hate speech, though. It noted that categories like that are "highly dependent on context" and need human review to make appropriate decisions. YouTube's crackdown on hate speech inadvertently pulled history videos and channels that were merely educating people on Nazis and World War II, for example -- clearly, the technology wasn't clever enough to make the distinction.

The achievements won't please everyone. In addition to those videos that might slip through the cracks, there have been concerns that YouTube might be stifling free expression with the aggressive takedowns. Critics have asserted that allowing hateful videos puts the perpetrators out in the open where they can be challenged and, ideally, steered away from their extreme views. However, YouTube and others don't see it that way -- they're worried about the potential for radicalization, not to mention the possibility these videos will show up in recommendations attached to more innocuous clips. YouTube doesn't want to take any chances, and that means pulling content en masse.
