YouTube will reduce conspiracy theory recommendations in the UK

The platform began similar efforts in the US back in January.

YouTube plans to tweak its recommendation algorithm to cut back on conspiracy theory videos in the UK, eight months after it conducted a similar experiment in the US. The platform is in the middle of rolling out the update to its British users, a spokesperson confirmed to TechCrunch. It's unclear when exactly the change will occur.

Back in January, the platform said it would begin reducing what it deemed "borderline content" -- videos that come close to violating YouTube's Community Guidelines without quite crossing the line -- along with videos that misinform people. The company listed Flat Earth, 9/11 and anti-vax conspiracy theories as some examples of content it would try to reduce.

It's unclear whether YouTube's efforts in the US are working. A Huffington Post investigation from July found that even though recommendations for conspiracy theories had been cut in half and some heavyweight distributors had been deplatformed, conspiracy theory videos were still thriving on the platform.

From David Icke to Paul Joseph Watson, the UK has its own troubled history with conspiracy theorists. Back in April, the UK government published an Online Harms Whitepaper that included numerous measures to tackle misinformation and online safety, including an independent watchdog to hold Big Tech companies accountable. But YouTube's managing director for the UK, Ben McOwen Wilson, has expressed concerns that government regulation of YouTube will result in censorship. "If they're government-appointed, that begins to look very much like censorship, and we don't launch in markets where that is a risk," said McOwen Wilson in an interview with the BBC.

Given the relatively light touch YouTube has taken with the US recommendation engine, it's perhaps unwise to expect a drastic change across the pond. Beginning in September 2020, the broadcasting regulator Ofcom will have the power to fine platforms like Facebook, Instagram and YouTube for exposing young people to harmful content. If YouTube's future efforts to rein in misinformation in the UK aren't effective, it will be up to regulators to take action.