In the first three months of 2019, Google manually reviewed more than a million suspected "terrorist videos" on YouTube, Reuters reports. Of those reviewed, it found that roughly 90,000 violated its terrorism policy. That means only about nine percent of the million videos were removed, suggesting either that videos must be rather extreme to get cut, or that the process flagging them for review is a bit of a catchall.
Google shared these numbers with a US House panel today. The company also said it has more than 10,000 people working on content review, and it spends hundreds of millions of dollars on those efforts annually.
Google, Facebook, Twitter and Microsoft have all been asked to reveal their counterterrorism budgets, but putting a number on those efforts has proven complex. The "hundreds of millions of dollars" estimate from Google is the closest thing to an answer we've seen so far.
The Christchurch shooting in New Zealand has added pressure on social media platforms to monitor content. As we saw earlier this spring, copies of the shooting video were uploaded to YouTube at a rate of one per second over the weekend following that tragedy. Since then, Australia has passed legislation holding social media companies accountable for removing violent content, and the EU is considering laws that would require terrorist content to be removed within one hour of notice. To meet such standards, Google will likely need a system that more accurately flags videos for manual review.
We reached out to Google, but the company had no comment.