Twitter and YouTube wouldn't delete an extremist cleric's posts (update: gone)

The sites only sometimes cited policy as a reason for leaving his posts up, even after his arrest.

Internet giants have been increasingly willing to take down extremist content, but their previous reluctance is coming back to haunt them. The UK recently convicted radical cleric Anjem Choudary (and co-defendant Mohammed Rahman) of rallying support for ISIS, and court documents reveal that neither Twitter nor YouTube agreed to take down key content. Twitter hasn't deleted Choudary's account, for example, despite British law enforcement's claims that it violates Twitter's policies on promoting terrorism, even after his arrest in September 2014. Twitter did pull Rahman's account, though not in response to an official request.

YouTube has pulled some videos, but not all. It wouldn't yank one Choudary clip because it was deemed "journalistic" (it was posted at a research institute), and only some of Rahman's content came down. One of his videos stayed online on the grounds that it fostered "religious debate."

We've reached out to both Twitter and YouTube for their take on the situation. Notably, officials said in the documents that they didn't have the power to make either site take the extremist material down. The big question is whether the sites would react differently now. Google, Twitter and others have taken a more aggressive stance against pro-terrorist content in recent months, even since the last Twitter takedown request in March 2016. It wouldn't be surprising if they pulled much more of the offending content in today's climate.

Update: Choudary's Twitter account has disappeared following news reports. Also, YouTube reiterated its policies, which call for pulling pro-terrorist content unless there's a "clear news or documentary purpose." You can read YouTube's full statement below.

"We have clear policies prohibiting terrorist recruitment and content intending to incite violence, and quickly remove videos violating these policies when flagged by our users. We also terminate accounts run by terrorist organisations or those that repeatedly violate our policies. We allow videos posted with a clear news or documentary purpose to remain on YouTube, applying warnings and age-restrictions as appropriate."