A group of UK MPs is calling on the government to introduce "meaningful fines" for technology companies which fail to remove illegal content "within a strict timeframe." The report, published by the Commons home affairs committee, slams Google, Facebook and Twitter's efforts to curb the spread of hate, abuse and extremism online. While it praises their commitment to removing harmful content, it says "nowhere near enough" is being done to tackle the problem. "There are too many examples of social media companies being made aware of illegal material yet failing to remove it, or to do so in a timely way," the report reads.
The committee has recommended that the government consult on "a system of escalating sanctions" that would culminate in fines for social media companies which fail to act swiftly. The approach would mirror one under consideration in Germany, where proposed rules include fines of up to 50 million euros ($54.5 million). In addition, the group has suggested that Facebook, Google and Twitter pay for the monitoring and investigation work carried out by the Metropolitan Police on their platforms. British football clubs already pay for the policing around their stadiums, the MPs argue, so it makes sense for technology companies to do the same online.
The group points to effective tools that allow technology companies -- YouTube specifically -- to identify and, where necessary, take down copyright-infringing content. That same level of early, proactive identification should be possible for videos expressing hateful and extremist views, they argue. Relying on user-submitted reports isn't sufficient, the committee adds. "They are, in effect, outsourcing the vast bulk of their safeguarding responsibilities at zero expense," it says. "We believe that it is unacceptable that social media companies are not taking greater responsibility for identifying illegal content themselves."
Google, Facebook and Twitter want algorithms, rather than people, to sift through the content posted by their users. The group argues that while algorithms may be helpful, human judgement will always be needed for complex cases, and it's disappointed that YouTube only uses its algorithms to help advertisers. It has also called on the three companies to publish quarterly reports detailing their efforts to tackle the issue. These should include the number of reports received from users, how the company responded to them and the actions taken to prevent such content in the future.
"It is in everyone's interest, including the social media companies themselves, to find ways to reduce pernicious and illegal material," the group says. "Transparent performance reports, published regularly, would be an effective method to drive up standards radically, and we hope it would also encourage competition between platforms to find innovative solutions to these persistent problems."