YouTube has removed more than 400 channels and disabled comments on "tens of millions of videos" over the last few days after reports suggested a child porn ring was operating on the platform. In a comment on a video published by Philip DeFranco on Wednesday, the service's creator relations team said YouTube's staff are working "incredibly hard to root out horrible behavior" and have "reported illegal comments to law enforcement."
UPDATE: @YouTube @YTCreators left a comment and provided an update on what they've done to combat horrible people on the site in the last 48 hours. — Philip DeFranco (@PhillyD) February 21, 2019
TLDR: Disabled comments on tens of millions of videos. Terminated over 400 channels. Reported illegal comments to law enforcement. pic.twitter.com/zFHFfkX9FD
The issue came to light in a video posted this week by YouTuber Matt Watson, who claimed he'd uncovered "a soft-core pedophilia ring" facilitated by YouTube's recommendation algorithms. He said that in just a few clicks, he was able to access videos of children in whose comment sections pedophiles connected with one another and left predatory remarks. Some big-name advertisers, including Disney, Nestlé and Fortnite creator Epic Games, paused ad spending on the platform in the wake of Watson's video. YouTube has said it will refund advertisers whose ads appeared next to affected videos.
As DeFranco pointed out, this is an issue YouTube has been actively combating for years. HP, candy maker Mars and grocery chain Lidl pulled their advertising in 2017 following reports of similar predatory activity in the comments on videos of children. They later restored their ads after YouTube enacted additional measures aimed at stamping out child exploitation.