UK will hold social networks accountable for harmful content

They could face fines for failing to protect users.
Saqib Shah | @eightiethmnt | April 8, 2019 8:25 AM

The UK government plans to penalize tech companies like Facebook and Google that fail to curb the spread of harmful content on their platforms. As promised, it is seeking to empower an independent regulator to enforce rules targeting violent material, posts encouraging suicide, disinformation, cyberbullying, and child exploitation. Over the coming weeks, the government will consult on the punishments available to the new watchdog, including fines, blocking access to sites, and holding senior executives at tech companies personally accountable for failures.

Both Facebook and Google have previously denied responsibility for the content published on their sites, invoking Section 230 of the US Communications Decency Act to defeat lawsuits accusing them of enabling terrorism and spreading extremist views. But calls for big tech to be regulated have grown in recent years following a spate of controversial incidents, most recently the live-streaming on Facebook of the mass shooting in New Zealand.

Google, meanwhile, has been called out for the spread of conspiracy theories on YouTube. And Twitter has long grappled with toxic abuse on its site. Execs from all three companies have also appeared before Congress in relation to Russian activity on their respective platforms during the 2016 US election.

The new measures form part of the "Online Harms White Paper," a joint proposal from the UK's Department for Digital, Culture, Media and Sport (DCMS) and the Home Office, and have received the blessing of Prime Minister Theresa May.

"The internet can be brilliant at connecting people across the world - but for too long these companies have not done enough to protect users, especially children and young people, from harmful content," said May in a statement. "That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe."

Earlier this year, the DCMS committee referred to Facebook's senior management as "digital gangsters" in its report on fake news online. It added that CEO Mark Zuckerberg had shown "wilful contempt" toward the UK parliament by twice failing to appear before the committee. Facebook-owned Instagram was also recently forced to blur self-harm images on its app in the UK following the suicide of British schoolgirl Molly Russell, whose parents said her death came as a result of viewing images of self-harm on Instagram and Pinterest.
