Image credit: ASSOCIATED PRESS

Facebook says only 4,000 users viewed original NZ shooter livestream

Users tried to upload 1.5 million copies afterwards, but most were blocked.


Facebook says a total of 4,000 people viewed the New Zealand mosque shooter's livestream before it was taken down. Fewer than 200 people were watching during the assailant's live broadcast, according to the social network, and none of them reported it. Facebook says the first user report came in 29 minutes after the 17-minute video started, or 12 minutes after the livestream ended. The stats form part of Facebook's latest update detailing its ongoing response to the sharing of NZ shooting posts. They reveal the small-scale reach of the original broadcast but, as we know, that ultimately didn't stop it from being widely circulated around the web.

In the wake of the livestream, a version of the video surfaced on YouTube every second over the weekend. It was also shared to Reddit forums such as "r/watchpeopledie" and "r/Gore," both of which have since been banned. And Facebook itself scrambled to pull down 1.5 million videos of the incident in the first 24 hours.

Meanwhile, New Zealand ISPs including Vodafone, Spark and Vocus were forced to block access at the DNS level to websites that didn't respond to takedown requests. Together they cut off controversial messageboards such as 4chan and 8chan (where the shooter was a member and where, according to Facebook, he shared a link to a copy of the video hosted on a file-sharing site). Worse still, mainstream outlets like The Daily Mail and Sky News Australia ran excerpts from the shooter's Facebook livestream, prompting Sky New Zealand to pull the latter off air.

As usual, Facebook has been transparent in its response. But it's facing a chorus of condemnation from lawmakers worldwide, who have grown tired of its meek attempts at self-regulation. Germany has already set penalties for social media sites that fail to swiftly remove harmful content, and the UK is following suit. Though Facebook is pumping more money and manpower into its moderation systems, this latest failure will only bring more scrutiny of its review process.

Source: Facebook

