The UK government wants to put Ofcom in charge of regulating social media. Digital secretary Nicky Morgan and home secretary Priti Patel said they were "minded" to appoint the watchdog due to its experience and "proven track record" overseeing the UK's media and telecommunications industries. It would also avoid regulatory fragmentation, Patel and Morgan said, and be quicker to set up than a new regulator. Ofcom will be granted new powers to carry out its expanded responsibilities, which will cover any platform that hosts user-generated content, including comments and forum posts. It's safe to assume that social media giants including Facebook, YouTube, Twitter and TikTok will be in its cross-hairs, then.
"We will give the regulator the powers it needs to lead the fight for an internet that remains vibrant and open but with the protections, accountability and transparency people deserve," Morgan said.
Ofcom has welcomed its hypothetical new role, too. "We share the government's ambition to keep people safe online and welcome that it is minded to appoint Ofcom as the online harms regulator," Jonathan Oxley, Ofcom's interim CEO, said. "We will work with the government to help ensure that regulation provides effective protection for people online and, if appointed, will consider what voluntary steps can be taken in advance of legislation."
UK politicians want to introduce a so-called "duty of care" that will force social media companies to protect British users from harmful and illegal content. That includes anything related to bullying, harassment, terrorism, child abuse and grooming, gang culture, violence and fake news.
Today, the UK government confirmed that Ofcom's powers, and the regulation underpinning its work, will focus on "the wider systems and processes that platforms have in place to deal with online harms," instead of individual posts and takedown requests. Companies will be able to choose what legal content is allowed on their platform. They must clearly state what they consider acceptable, though, and remove any illegal content "expeditiously." "Reflecting the threat to national security and the physical safety of children, companies will be required to take particularly robust action to tackle terrorist content and online child sexual exploitation and abuse," the government said today. Companies will also be required to have systems that allow users to report harmful content and question takedowns.
For now, it's not clear how Ofcom will monitor and assess companies' systems. There's also no detail on the fines or punishments Ofcom will be able to issue. The UK government said it will publish further details about the regulator's new powers this spring. It stressed that it wants to "set the direction" through legislation but give Ofcom control over its specific processes and procedures. That way, it argued, the watchdog will be able to adapt to any technological change.
Alongside today's news, Ofcom announced that Dame Melanie Dawes, currently permanent secretary at the Ministry of Housing, Communities and Local Government, will become its new CEO in March. If Ofcom is formally appointed (for now, the government is only "minded" to do so), it will also have a responsibility to protect free speech, the role of the press, and technological innovation. The UK government stressed that Ofcom's new powers won't stop British adults from accessing and posting legal content online, even if some may find it offensive.
Facebook said it had "long called for new regulations" and looked forward to working with the UK government on the issue. "New rules are needed so that we have a more common approach across platforms and private companies aren't making so many important decisions alone," Rebecca Stimson, Facebook's head of UK public policy, said. "This is a complex challenge as any new rules need to protect people from harm without undermining freedom of expression or the incredible benefits the internet has brought."
YouTube said it was already tackling the problem but welcomed Ofcom's expanded role and new regulation. "Our work has the most impact when companies, Government and communities work together," Ben McOwen Wilson, YouTube's UK managing director, said. "We look forward to working in partnership with the Government and Ofcom to ensure a free, open and safer internet that works for everyone." A spokesperson for TikTok added: "Providing our users with a safe and positive environment in which they can express their creativity is our top priority. To that end, we are continuously evolving our own measures to further strengthen safety on TikTok. We look forward to working with government, wider industry and all relevant organisations to support our mutual objective of nurturing a safer online experience for everyone."
Today's decision is the government's "initial response" to an Online Harms White Paper that was announced by Home Secretary Sajid Javid in late 2018. A four-month consultation period ran last year, and the final paper was published in April. It recommended a new regulatory framework to better protect British citizens online, and an independent regulator that could set "clear safety standards, backed up by reporting requirements and effective enforcement powers." Ofcom wasn't named at the time, however, and some wondered whether the UK government would create an entirely new regulator for the job.