Twitter says it inadvertently ran ads on profiles containing CSAM

The social network is investigating the incident.


Twitter is still having trouble curbing the spread of CSAM (child sexual abuse material). Insider has learned (subscription required) that Twitter inadvertently ran ads on profiles that were either selling or soliciting CSAM. In an email to marketers, the social network said it had suspended all ads on the profiles in question, updated its detection systems, banned accounts that broke its rules and launched an investigation. Reuters notes that Coca-Cola, Disney and NBCUniversal were among the brands whose ads appeared next to the offending content.

Twitter said its existing technology had already blocked more than 91 percent of accounts like these. In its most recent transparency report, the company said it took action against 31 percent more CSAM-related accounts in the second half of 2021.

A Twitter spokesperson confirmed the incident and investigation in a statement. On top of existing work to catch CSAM, the company said it was ensuring it had the "right models, processes and products" to protect both advertisers and users.

The news is ill-timed for Twitter. It comes just weeks after The Verge reported that Twitter ditched efforts to build an OnlyFans clone over concerns it couldn't effectively catch CSAM and other forms of sexual abuse. It's also emerging as the social media company continues to fight with Elon Musk over the fate of his potentially cancelled $44 billion acquisition. Musk has focused most of his objections on alleged misreporting of fake account data.

There's been an immediate financial impact as well. Reuters added that big names like Dyson and Mazda had either frozen their marketing campaigns or pulled ads from some areas of Twitter. More fallout might be coming — Coca-Cola and Disney both said they considered the activity unacceptable, while NBCUniversal told Twitter to remove ads that ran against CSAM.