
YouTube cracks down on QAnon videos that target individuals or groups

The platform is banning content that can be used to justify real-world violence.
Chris Velazco | @chrisvelazco | October 15, 2020 11:22 AM

With just weeks to go before the 2020 presidential election, YouTube has confirmed it is expanding efforts to crack down on harmful conspiracies being shared on its platform, with a specific focus on prohibiting "content that targets an individual or group with conspiracy theories that have been used to justify real-world violence."

"One example would be content that threatens or harasses someone by suggesting they are complicit in one of these harmful conspiracies, such as QAnon or Pizzagate," the company wrote in a blog post.

For the unaware, QAnon originally centered around cryptic posts shared on 4chan by an individual with the tripcode “Q Clearance Patriot.” Over time, the account jumped from 4chan to 8chan, and from there it began to enter the mainstream. Before long, Q’s cadre of followers metamorphosed into a loosely connected alliance of far-right conspiracy theorists who allege — among other things — that President Trump is facing down a shadowy cabal of Satanists and pedophiles who operate sex-trafficking rings.


While YouTube's statement notes that the company has already removed hundreds of channels and thousands of QAnon-related videos, it did not offer guidance on how many more would be affected as a result of this shift in policy.

It’s also worth noting that today's move is not quite as drastic as those taken by other major Silicon Valley players. Last week, Facebook expanded on earlier efforts to ban QAnon pages and groups that discussed violence by banning QAnon-centric accounts entirely. (Individual users, however, are still allowed to post QAnon content to their accounts.) And over the summer, Twitter banned thousands of QAnon accounts, prevented even more from appearing in users' recommendations, and blocked URLs associated with QAnon content from being shared on the platform.
