In the 24 hours after the Christchurch shooting, Facebook removed 1.5 million videos worldwide, but more than a month later, footage was still circulating on the platform. Now, the company says its AI struggled to detect the footage because of how it was filmed.
"This was a first-person shooter video, one where we have someone using a GoPro helmet with a camera focused from their perspective of shooting," Neil Potts, Facebook's public policy director, told British lawmakers. As Bloomberg reports, this violence was so unprecedented that the AI didn't know what to look for.
The company has since come under fire for failing to remove the videos quickly enough, and the EU is considering legislation that could fine social media platforms that don't remove terrorist content within one hour of notification. The admission underscores that platforms will need to bolster their AI detection systems to meet such deadlines.