Russian spam accounts are still a big problem for Reddit

At least 89 subs have been targeted by Russian propaganda.

This article was produced in partnership with Point, a YouTube channel for investigative journalism.

Last September, a Reddit user called DivestTrump uploaded a detailed report to Reddit about suspicious political posts that targeted the site's main pro-Trump discussion forum. The content chiefly originated from two websites, both of which have been linked to Russia's Internet Research Agency and to individuals under investigation in Robert Mueller's probe.

But it turns out that's just the tip of the iceberg: tens of millions of Reddit users could have been influenced.

In a joint investigation, Point and Engadget have learned that at least three additional domains are also targeting r/The_Donald and other conservative subreddits in similar ways.

"I've continued to hunt down Russian propaganda," said DivestTrump, who has requested anonymity to avoid online abuse. "It's in the tens of thousands of posts and thousands of users are spreading it ... it's incredibly pervasive."

These suspected Russian propaganda sites have been targeting at least 89 subreddits. In partnership with the data-analytics company Gravwell, we scrutinized the nature and scale of these posts. Our findings suggest an ongoing, Russian-led attempt to antagonize and influence Americans online.

The total subscriber base of all 89 subreddits is in excess of 68 million registered users.

The majority of the subreddits are conservative or right-leaning in content, such as r/ConservativesOnly, r/The_Donald and r/DrainTheSwamp. But some of the largest subreddits with posts from these domains are non-partisan or left-leaning, such as r/worldnews, r/atheism and r/COMMUNISM.

At the time of writing, Reddit had banned six of the 89 subreddits, although two of those were banned because they had no moderators, not because Reddit deemed they had broken its rules. Additionally, Reddit has quarantined one of the subreddits, r/911truth, which now carries a misinformation warning that users must click through before accessing the content.

The site's LinkedIn page lists its location as Moscow, and its social media manager is Moscow-based Dmitry Kukushkin. Another of the sites was created by a Russian, Alexander Malkevich, who works for Yevgeny Prigozhin, a man indicted for interfering in U.S. elections. The indictment says Prigozhin bankrolled Russia's Internet Research Agency, which has been accused of being a troll factory.

The other website that DivestTrump exposed last month left a similarly easy trail of breadcrumbs to follow.

In June 2017, a job listing was posted to a Russian career website with a contact email address at the site's domain. The advertised position was for a front-end developer with English skills who would work from home, with occasional work trips to St. Petersburg.

"A lot of times [these sites] can look American. Sometimes they don't even try to cover their tracks, which was the case with -- they were registered, they were hosted, all out of St. Petersburg," said DivestTrump.

The website has since moved its hosting to the United States, but an archived whois record shows that it was originally registered in Russia.
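Archived whois records are plain text in a simple "Key: Value" layout, so checking a domain's original registrant country is a matter of pulling out the relevant field. A minimal sketch in Python — the sample record below is invented for illustration, not the site's actual whois data:

```python
# Extract registrant fields from a plain-text whois record.
# The sample record is made up for illustration; real archived
# records follow the same "Key: Value" line format.

def parse_whois(record: str) -> dict:
    """Collect 'Key: Value' lines from a whois record into a dict."""
    fields = {}
    for line in record.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            key, value = key.strip(), value.strip()
            if value and key not in fields:  # keep the first occurrence
                fields[key] = value
    return fields

sample_record = """\
Domain Name: example.com
Registrar: Example Registrar LLC
Registrant Country: RU
Registrant City: Saint Petersburg
"""

info = parse_whois(sample_record)
print(info["Registrant Country"])  # RU
```

A later snapshot of the same record with a different "Registrant Country" is exactly the kind of change the investigation relied on.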

After posting the now-viral thread to Reddit exposing those two sites, DivestTrump told Point and Engadget that three more domains were behaving in very similar ways on Reddit.

South Front was previously flagged as suspicious by Jessikka Aro, a researcher at the Wilfried Martens Centre for European Studies in Brussels. In an academic paper published in the journal European View, she cited South Front as a pro-Kremlin domain. "South Front portrays itself as being a crowdsourced project, but it looks more like a professional info-war project run or backed by the Russian military," she wrote.

Other journalists have also named and shamed the sites.

"Less than one percent of communities were responsible for 75 percent of antisocial behavior."

It's hard to prove beyond doubt that a website is part of a misinformation campaign directed from the Kremlin, but researchers have shown that it only takes a small band of devotees to tip the balance of an online conversation.

Srijan Kumar, a postdoctoral researcher at Stanford University, scraped Reddit's comments and carried out a data-driven analysis of conflicts on the site. He found that a few bad eggs can have an outsized impact.

"Less than one percent of communities were responsible for 75 percent of antisocial behavior," he said.

Once a troll initiates the conflict by insulting a person or saying something extreme, they simply sit back and watch the rest of the thread do the work for them, explained Kumar.

"What a Russian troll or bot could do is essentially start these conflicts so that people get more engaged in the community and that would increase the visibility of the community and therefore increase the anger that they have stirred."

Gravwell conducted a sentiment analysis of the Reddit posts linking to these domains -- and there's no shortage of this kind of behavior.

"When we applied the machine learning from the training data, it's an overwhelmingly negative setup," said Gravwell co-founder Corey Thuen.

But sentiment analysis is not yet an exact science, and Thuen said the results probably overestimate how negative the posts are, because some positive ones were incorrectly classified as negative.
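The kind of misclassification Thuen describes is easy to reproduce with even a crude word-list scorer, where negation flips a genuinely positive post below zero. A minimal sketch of the failure mode — not Gravwell's actual model, which was trained with machine learning:

```python
# Crude lexicon-based sentiment scoring, to illustrate how a
# positive post can be misread as negative. This is a simplified
# stand-in, not Gravwell's trained model.

POSITIVE = {"great", "love", "win", "good"}
NEGATIVE = {"hate", "awful", "angry", "liar", "bad", "not"}

def score(text: str) -> int:
    """Sum +1 per positive word and -1 per negative word."""
    words = text.lower().split()
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in words)

print(score("you are an angry liar"))        # -2: correctly negative
print(score("not bad at all, i love this"))  # -1: positive post scored negative
```

The second post is friendly, but "not" and "bad" each count against it, so the scorer calls it negative — the same direction of error Thuen flagged in the real analysis.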

Still, the data we do have, imperfect though it may be, backs up the pattern of behavior that Kumar has studied.

Point and Engadget reached out to Reddit for comment on this report, but received no response.

Reporter: Benjamin Plackett
Editors: Aaron Souppouris, Jay McGregor
Images: Reddit/Point

Video by Point
Narrator: Jay McGregor
Producers: Jay McGregor, Aaron Souppouris
Editor: Anton Novoselov