Facebook says only 4,000 users viewed original NZ shooter livestream

Users tried to upload 1.5 million copies afterwards, but most were blocked.
Saqib Shah, @eightiethmnt
March 19, 2019

Facebook says a total of 4,000 people viewed the New Zealand mosque shooter's livestream before it was taken down. Fewer than 200 people were watching during the live broadcast itself, according to the social network, and none of them reported it. Facebook says the first user report came in 29 minutes after the 17-minute live video started, which was 12 minutes after the livestream ended. The stats form part of Facebook's latest update detailing its ongoing response to the sharing of posts about the NZ shooting. They reveal the original broadcast's small-scale reach, but, as we know, that ultimately didn't stop the video from being widely circulated around the web.

In the wake of the livestream, a new copy of the video surfaced on YouTube as often as once every second over the weekend. It was also shared to Reddit forums such as "r/watchpeopledie" and "r/Gore," both of which have since been banned. And Facebook itself scrambled to pull down 1.5 million copies of the video in the first 24 hours.

Meanwhile, New Zealand ISPs including Vodafone, Spark and Vocus were forced to block access at the DNS level to websites that didn't respond to takedown requests. Together they cut off controversial messageboards such as 4chan and 8chan (where the shooter was a member and, according to Facebook, he shared a link to a copy of the video hosted on a file-sharing site). Worse still, mainstream outlets like The Daily Mail and Sky News Australia ran excerpts from the shooter's Facebook livestream, prompting Sky New Zealand to pull the latter off air.

As usual, Facebook has been transparent in its response. But it's facing a chorus of condemnation from lawmakers worldwide, who've grown tired of its meek attempts at self-regulation. Germany has already set penalties for social media sites that fail to swiftly remove harmful content, and the UK is following suit. Though Facebook is pumping more money and manpower into its moderation systems, this latest failure will only bring more scrutiny of its review process.
