Germany can fine Facebook up to $57 million over hate speech

Digital rights activists worry that the new law will curb free speech.

Germany has passed a contentious law allowing fines of up to €50 million ($57 million) for social networks like Facebook and Twitter if they don't pull hate speech down quickly enough. Called the Network Enforcement Act or "Facebook Law," it was passed by Germany's parliament on Friday, and will go into effect starting this October.

Facebook, for one, doesn't think the law will help. "We believe the best solutions will be found when government, civil society and industry work together and that this law as it stands now will not improve efforts to tackle this important societal problem," Facebook told Engadget via an email statement. "We feel that the lack of scrutiny and consultation do not do justice to the importance of the subject. We will continue to do everything we can to ensure safety for the people on our platform."

Social networks could be fined for failing to remove material that is "clearly criminal" within 24 hours. Depending on how severe the content is deemed, fines could rise to €50 million ($57 million). For content whose legality is less clear, social networks have up to seven days to pull it down.

That covers a lot of potential posts, because Germany has some of the world's toughest laws around defamation, hate crimes involving public incitement and threats of violence. That includes prison sentences for Holocaust denial and inciting hatred against minorities.

In a speech before the parliament, Justice Minister Heiko Maas pointed out that hate crimes in Germany have tripled in the past two years. "It is about a principled decision for the digital age," he said. "Freedom of expression ends where criminal law begins ... we must finally enforce rights and laws on the internet."
Maas asserted that the law would not infringe on free speech, but critics aren't so sure. When the law was proposed, Facebook said it shifted the burden of law enforcement "from public authorities to private companies" and that it would provide an incentive for social networks to "delete content that is not clearly illegal." The company added that "several legal experts have assessed the draft law as being against the German constitution and non-compliant with EU law."

Other critics, like Thorsten Benner in Handelsblatt, called the new rules "woefully misguided," saying "private companies ... will become judges over complex free-speech issues." However, 70 percent of Germans back the rule, he added, because "appearing 'tough' on US social media behemoths is popular with the public."

It didn't help that Facebook, Twitter and Google struck a deal with Germany in 2015 to pull down hate speech and other content, then failed to meet that commitment, according to a government report. In one instance, Facebook delayed pulling a post on a far-right group's page that targeted Jewish people and businesses, resulting in telephone threats to at least one of them. It later apologized, pulled the post and acknowledged that the content was hate speech.

In addition, Facebook and Google recently pledged to work harder to fight hatred or the promotion of terrorism, and Facebook said it would hire an extra 3,000 employees to do so. However, a ProPublica report showed that Facebook's rules can, under certain conditions, allow for inciting violence against groups of people, something that's a criminal offense in Germany.

Maas wants a similar rule implemented across the EU, and the European Council recently approved new laws that would force websites to block videos that contain hate speech or incitement to terrorism. However, those laws still have to be passed by the EU parliament. Facebook's full statement is below.

We share the goal of the German government to fight hate-speech. We have been working hard on this problem and have made substantial progress in removing illegal content. We have recently announced that we will be adding 3,000 people to our community operations team, on top of the 4,500 we have today.

We're also building better tools to keep our community safe and make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact the police if someone needs help.

We believe the best solutions will be found when government, civil society and industry work together and that this law as it stands now will not improve efforts to tackle this important societal problem. We feel that the lack of scrutiny and consultation do not do justice to the importance of the subject. We will continue to do everything we can to ensure safety for the people on our platform.