YouTube is clamping down on the disturbing videos aimed at children on its service. The burgeoning genre depicts family-friendly characters (like Spider-Man and Frozen's Elsa) in violent and sexual scenarios. We reported on the clips (and inappropriate ads) earlier this year, and recent coverage in The New York Times claimed the videos are evading filters on the YouTube Kids app. In August, YouTube introduced a policy that barred creators from monetizing videos that make "inappropriate use of family friendly characters." The video platform's latest step will automatically block this content from its kids app, as initially reported by The Verge.
"We're in the process of implementing a new policy that age restricts this content in the YouTube main app when flagged," YouTube's director of policy Juniper Downs said in a statement. "Age-restricted content is automatically not allowed in YouTube Kids."
The move adds a human element to the policing process. Until now, Google has mainly relied on its algorithms to filter inappropriate content from its kids app, but that obviously didn't do the trick. The company also claims that dedicated human teams work around the clock to block disturbing videos. Google will be hoping that the age-gate feature, combined with its existing security layers, will stop the sinister clips from reaching children. Time may also be on its side, as it takes several days for a video from YouTube proper to reach YouTube Kids. The changes will go live within a few weeks.
It seems YouTube doesn't want to censor the content altogether. But its latest safeguard will also impact its main service, where age-restricted videos are only accessible to signed-in users aged 18 and over. Meanwhile, creators of the clips will feel the heat from the lack of monetization.
Parents should also note that YouTube Kids already contains controls that let you block channels and turn search on and off -- there are also the relatively new kids profiles.