Internet giants have been racing to pull copies of the New Zealand mass shooter's video from their sites, and Facebook is illustrating just how difficult that task has been. Facebook New Zealand's Mia Garlick has revealed that the social network removed 1.5 million attack videos worldwide in the first 24 hours, 1.2 million of which were stopped at the upload stage. This includes versions edited to remove the graphic footage of the shootings, Garlick said, as the company wants to both respect people affected by the murders and the "concerns of local authorities."
The shooter wore a head-mounted camera to livestream the attack, and in his social posts and manifesto suggested that he wanted to stoke tensions that supported his anti-immigrant agenda. There's been a concern that spreading the video is giving the attacker the exposure he wanted in addition to traumatizing viewers.
While the removal rate suggests that Facebook is having some success pulling the video, it also shows how difficult it can be to contain the spread of material like this. About 300,000 copies made it to the site before they were taken down, after all. The company's existing mix of automated and human moderation can only do so much when it's relatively easy to upload a video or make edits that can bypass content filters. Facebook and other internet firms may end up reviewing their methods to see if they can speed up video removals and make it harder to spread footage like this in the first place.
"In the first 24 hours we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload..." — Facebook Newsroom (@fbnewsroom), March 17, 2019