
Image credit: Prykhodov via Getty Images

Court says Facebook not to blame for Israeli terror incident

And the decision might have bigger implications for social networks.

Facebook is currently the defendant in several lawsuits accusing the social network of enabling terrorism and the spread of extremist views. Now, one of those cases has reached a resolution, and it could shape how courts handle the others. According to documents obtained by The Verge, a federal court in the Eastern District of New York has dismissed a lawsuit that sought to hold Facebook legally responsible for the deaths of five people killed in Palestinian terrorist attacks in Israel back in 2015.

Their families sued the social network last year, claiming that the website "played an essential role in [Hamas'] ability to carry out its terrorist activities," referring to the terrorist organization behind the attacks. Facebook, they said, made it easier for the perpetrators to "communicate, recruit members, plan and carry out attacks, and strike fear in its enemies." They asked for $1 billion in damages and for the company to stop providing its services to terrorists.

The court, however, decided that Section 230 of the Communications Decency Act grants Facebook immunity from lawsuits like this. That section states that services like Facebook can't be treated as the publisher of content posted by their users, and thus can't be held responsible for it. The decision reads:

"While the Force Plaintiffs attempt to cast their claims as content-neutral, even the most generous reading of their allegations places them squarely within the coverage of Section 230's grant of immunity. In their opposition to the present motion, the Force Plaintiffs argue that their claims seek to hold Facebook liable for "provision of services" to Hamas in the form of account access "coupled with Facebook's refusal to use available resources... to identify and shut down Hamas accounts."

"While superficially content-neutral, this attempt to draw a narrow distinction between policing accounts and policing content must ultimately be rejected. Facebook's choices as to who may use its platform are inherently bound up in its decisions as to what may be said on its platform, and so liability imposed based on its failure to remove users would equally "derive from [Facebook's] status or conduct as a 'publisher or speaker.'"

In a statement sent to The Verge, Facebook said:

"We appreciate the court's consideration on this matter. Our Community Standards make clear that there is no place on Facebook for groups that engage in terrorist activity or for content that expresses support for such activity, and we take swift action to remove this content when it's reported to us. We sympathize with the victims and their families."

Facebook isn't the only tech titan in the midst of legal battles over extremist activity on its platform. Late last year, the families of Pulse nightclub shooting victims sued both Facebook and Google for providing "material support" to the gunman, who had pledged allegiance to ISIS. Prior to that, the wife of one of the victims of a shooting in Jordan sued Twitter for allowing ISIS activity to spread on its website. More recently, relatives of the San Bernardino shooting victims also filed a lawsuit against Twitter, Facebook and Google for letting terrorist activity flourish on their platforms.
