Engadget has been testing and reviewing consumer tech since 2004. Our stories may include affiliate links; if you buy something through a link, we may earn a commission.

Facebook vows to improve AI detection of terrorist videos

The platform explained why it wasn't able to detect the NZ shooter's live broadcast.

Facebook rushed to pull the New Zealand mass shooter's video from its platform, but it didn't start doing so until after the live broadcast had ended. In a new post, Facebook VP of Integrity Guy Rosen discussed the company's successes and shortcomings in addressing the situation, as well as its plans to prevent videos like that from spreading on the social network in the future.

He explained that while the platform's AI can quickly detect videos containing suicidal or harmful acts, the shooter's stream didn't trigger it. To train its matching AI to detect that specific type of content, the platform needs large volumes of training data, which, as Facebook explains, is difficult to obtain because "these events are thankfully rare." In addition, none of the people who watched the live broadcast reported it: the first user report came in 29 minutes after the broadcast began and 12 minutes after it ended. To be fair, the live broadcast was viewed fewer than 200 times, while the original video was watched around 4,000 times overall.

Rosen also explained why more than 300,000 copies managed to circulate on the platform even after Facebook's systems had detected and removed 1.2 million copies of the video at upload. He said a "core community of bad actors" continually re-uploaded edited versions of the video. By tweaking it slightly rather than uploading an identical copy of the original, they were able to slip past the platform's filters. Some even played the original on their computers and recorded it with their phones. In all, Facebook detected more than 800 visually distinct variants of the video.
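The evasion Rosen describes works because exact, byte-level fingerprints change completely when a video is re-encoded, cropped, or re-recorded, which is why matching systems tend to rely on perceptual hashes instead. The sketch below is a toy illustration of that idea, not Facebook's actual system: an exact SHA-256 hash of a simulated grayscale thumbnail breaks under a tiny pixel change, while a simple "average hash" of the same thumbnail stays stable.

```python
import hashlib

def exact_hash(data: bytes) -> str:
    """Byte-exact fingerprint: any change at all produces a new hash."""
    return hashlib.sha256(data).hexdigest()

def average_hash(pixels: list[int]) -> tuple[bool, ...]:
    """Toy perceptual hash: each bit records whether a pixel is brighter
    than the thumbnail's average, so tiny edits rarely flip bits."""
    avg = sum(pixels) / len(pixels)
    return tuple(p > avg for p in pixels)

def hamming(a: tuple[bool, ...], b: tuple[bool, ...]) -> int:
    """Number of differing bits between two perceptual hashes."""
    return sum(x != y for x, y in zip(a, b))

# Stand-in for a flattened 8x8 grayscale video thumbnail (values 0-255).
original = [10 * i % 256 for i in range(64)]

# Simulate a slight re-encode: nudge one pixel's brightness.
tweaked = original.copy()
tweaked[0] += 3

# The exact hash no longer matches, so a byte-level filter misses it...
print(exact_hash(bytes(original)) == exact_hash(bytes(tweaked)))  # False

# ...but the perceptual hashes are still identical (0 differing bits).
print(hamming(average_hash(original), average_hash(tweaked)))  # 0
```

Real systems compare perceptual hashes with a distance threshold rather than exact equality, so near-duplicates still match even when a few bits flip, which is roughly how edited re-uploads can be caught without a byte-identical copy.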

To prevent similar videos from circulating in the future, Facebook plans to improve its matching AI, including by giving it audio-based detection capabilities. It also needs to ensure the AI can clearly distinguish that kind of footage from visually similar content, such as livestreamed video games. In addition, Facebook is exploring ways to use AI to flag live broadcasts like that much faster, as well as ways to address user reports more quickly.