YouTube is taking far-reaching action to prevent a repeat of the pedophilic comments that plagued videos on the service. It has disabled comments on "tens of millions" of clips featuring young children over the past week, and plans a "broadening" of this action in the months ahead to cover both videos with young kids and those with older kids that "could be at risk of attracting predatory behavior."
The service also expects to detect more of these clips going forward. It has "accelerated" the rollout of an AI-based comment classification system that will be "more sweeping in scope" and promises to remove twice as many individual comments as before.
It's an aggressive move, but YouTube was under pressure to do something. In addition to the uproar over the company once again grappling with predatory comments, advertisers like AT&T, Disney and Epic Games pulled or suspended their ads until YouTube cleaned up its act. If YouTube didn't make an effort to shut down the ad hoc child porn rings that were emerging, it risked alienating both users and ad dollars.