A nationwide manhunt for Steve Stephens, the 37-year-old from Cleveland who uploaded a video to Facebook of himself shooting an elderly stranger in the head, came to an end today. Stephens committed suicide after a brief car chase with state police in Erie, Pennsylvania. His crime, which took place this past Sunday, sparked outrage not only because of the violence itself, but also because of the way Facebook handled the situation. It took the social network over two hours to take the video down, although it claims this was because the clip wasn't flagged immediately by other users. Facebook says Stephens' actions weren't reported until he used the Live feature to stream his murder confession, about an hour and 45 minutes after the shooting video was uploaded. His account has since been suspended.
"This is a horrific crime and we do not allow this kind of content on Facebook," the company said in a statement. "We work hard to keep a safe environment on Facebook, and are in touch with law enforcement in emergencies when there are direct threats to physical safety." As it stands, Facebook relies heavily on people flagging graphic content (the same way it does sketchy ads), which means individuals have to actually see something dreadful before they can flag it. As Wired reported earlier this year, Facebook has opted not to use algorithms to censor videos like this before they're posted, claiming that it doesn't want to be accused of violating freedom-of-speech rights. But, as these types of cases mushroom, the company may be forced to change its stance sooner rather than later.
It could be hard for the company to build an algorithm that can successfully tell the difference between a video of someone being murdered and a clip from, say, a Jason Bourne movie. But, according to Facebook VP of Operations Justin Osofsky, his team is constantly exploring new technologies that can help create a safer environment on the site. Osofsky pointed to artificial intelligence in particular, which he says is already helping prevent certain graphic videos from being reshared in their entirety. Facebook's explanation of this is confusing, though: It says people "are still able to share portions of the videos in order to condemn them or for public awareness, as many news outlets are doing in reporting the story online and on television."
The company didn't clarify how the feature works when we reached out, but it's clear a video like Stephens' should be removed completely and immediately. And, as a result of this weekend's events, Osofsky said Facebook is reviewing its reporting system to ensure that people can flag explicit videos and other content "as easily and as quickly as possible."
Unfortunately for Facebook, Stephens' case isn't the first time it has faced scrutiny over people using its tools to promote violence. Back in March, Chicago police charged a 14-year-old boy after he used Facebook Live to broadcast the sexual assault of a 15-year-old girl, just one of many gruesome clips to hit Facebook recently. Per The Wall Street Journal, more than 60 sensitive videos, including physical beatings, suicides and murders, have been streamed on Facebook Live since it launched to the public last year. This raises the question: Should the Federal Communications Commission regulate social networks the way it does TV? In 2015, former FCC Chairman Tom Wheeler said there were no plans to do so, claiming he wasn't sure the agency's authority extended to "picking and choosing among websites."
The FCC, now headed by Ajit Pai under President Donald Trump, did not respond to our request for comment on the matter. That said, a source inside a major video-streaming company thinks services such as Facebook Live, Periscope and YouTube Live would benefit from having a "delay" safeguard in place. This could be similar in practice to how TV networks handle live events, which typically air with a delay of several seconds in case something unexpected happens. Remember when Justin Timberlake exposed Janet Jackson's breast during the Super Bowl XXXVIII halftime show in 2004? Delay systems like these are designed to keep scenes like that from ever reaching your TV.
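In software terms, the delay safeguard the source describes amounts to a short buffer between ingest and broadcast: frames are held for a fixed window before going out, giving a moderator (or an automated filter) a chance to cut the feed. A minimal sketch of the idea follows; the class name, method names and parameters here are hypothetical, purely for illustration, and not any platform's actual API.

```python
from collections import deque

class DelayedStream:
    """Illustrative fixed-delay buffer for a live feed.

    Incoming frames are held until `delay_frames` newer frames have
    arrived, mimicking a broadcast delay: content in the window can
    still be dropped before it ever reaches viewers.
    """

    def __init__(self, delay_frames):
        self.delay_frames = delay_frames
        self.buffer = deque()

    def push(self, frame):
        """Queue an incoming frame; return the frame that has cleared
        the delay window, or None while the buffer is still filling."""
        self.buffer.append(frame)
        if len(self.buffer) > self.delay_frames:
            return self.buffer.popleft()
        return None

    def cut(self):
        """Drop everything still inside the window, e.g. after a flag."""
        self.buffer.clear()

# With a 3-frame delay, nothing airs until the window has filled:
stream = DelayedStream(delay_frames=3)
aired = [stream.push(f) for f in ["f1", "f2", "f3", "f4", "f5"]]
# aired == [None, None, None, "f1", "f2"]
```

In a real system the window would be measured in seconds of encoded video rather than frame counts, and `cut()` would be wired to both human moderators and automated classifiers, but the basic shape, buffer on ingest, release on a delay, purge on a flag, is the same.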
"Facebook has really jumped very quickly into the video space, which is exciting, but it's taking a fail-fast approach to it," the source, who asked to remain anonymous, said. "In the desire to push Live out to as many people as possible, there were a lot of corners that were cut. And when you take a fail-fast approach to something like live-streaming video, it's not surprising that you come across these scenarios in which you have these huge ethical dilemmas of streaming a murder, sexual violence or something else."
As for why individuals are using these platforms to broadcast their heinous acts, Janis L. Whitlock, a research scientist at Cornell University's College of Human Ecology, says it's hard to pinpoint the reason because there's no way to run a controlled experiment. She says there's a good chance Stephens was struggling with a mental illness and saw his victim, 74-year-old Robert Godwin Sr., as an object in an ongoing fantasy. Whitlock says that while there's a good side to these social networks, they also tend to bring out the worst in people, especially those who are craving attention: "They make the most ugly of us, the most ugly in us, visible."
"The fact that you can have witnesses, like billions of people witness something in a tiny period of time, it has to have an enormous impact on the human psyche," she says. "How does that interact with the things that people do, or choose not to do? We don't know yet, but it does, absolutely. I have no doubt about that as a psychologist." Whitlock says companies like Facebook must start taking some civic responsibility, adding that there needs to be a conversation between Facebook and other internet giants about how their products "interact with who humans are" and how they can expose someone's limitations and potentials.
"How is it that we can use and structure these things to really amplify all the ways in which we're amazing," Whitlock says, "and not the ways in which we're disgusting?"