YouTube's AI can automatically age-restrict inappropriate videos
The company will also make it more difficult for users to skirt those restrictions.
YouTube already uses machine learning to flag inappropriate content for its Trust and Safety team to review. Starting later this year, the company will expand that system so it applies age restrictions to inappropriate videos automatically.
As part of the same initiative, YouTube plans to make it more difficult for children to skirt those restrictions. If your child tries to watch a restricted video through an embed on another website, Google will redirect them to YouTube, where they'll need to sign in to prove they're over 18. "This will help ensure that, no matter where a video is discovered, it will only be viewable by the appropriate audience," the company said.
Google says people who upload videos its algorithm restricts will have the chance to appeal those decisions. It also notes the additional automation won't have a significant effect on content creators' revenue.
Today's announcement is Google's latest move to make YouTube appear more responsible to parents following a string of controversies over children's safety on the platform. In 2019, the company paid $170 million to settle allegations that it had illegally collected data from kids who watched videos on the service. Since then, it has rolled out features like a web version of its YouTube Kids app to try to address those concerns.