Europe asks social networks to remove terrorist content within an hour

The European Commission published new guidelines today.

The European Commission published new guidelines for social networks today, and among them is a request that these sites remove reported terrorist content within one hour. In 2016, the Commission called on Facebook, Twitter, YouTube and Microsoft to put a more concerted effort into removing hate speech from their platforms, and since then it has been fairly pleased with the companies' progress. Last June, the four companies together were able to review 51 percent of hate speech reports within 24 hours, and in January the Commission reported that the rate had climbed to 81 percent. But the Commission is concerned about terrorist content in particular, and it is now asking these companies for an even quicker turnaround when reviewing this type of material.

"Online platforms are becoming people's main gateway to information, so they have a responsibility to provide a secure environment for their users," Andrus Ansip, VP for the Digital Single Market, said in a statement. "What is illegal offline is also illegal online. While several platforms have been removing more illegal content than ever before -- showing that self-regulation can work -- we still need to react faster against terrorist propaganda and other illegal content which is a serious threat to our citizens' security, safety and fundamental rights."

In the newly published guidelines, the Commission requests that terrorist content be reviewed and removed within one hour of being reported, because it "is the most harmful in the first hours of its appearance online." The Commission also asks social networks to implement better automated detection so that they don't have to rely as heavily on user reports. Additionally, for terrorist material and other illegal content -- including incitement to hatred and violence, child sexual abuse material, counterfeit products and copyright infringement -- the Commission requests that more efficient detection tools be developed, that those tools be shared throughout the industry and that all platforms work more closely with law enforcement.

The Commission says it will continue to review social networks' performance against these new guidelines and will determine later whether additional steps, such as legislation, are needed.