
How hateful alt-right trolls hijacked your timeline

Trump’s belligerent troll-bots are really great, really really great, the greatest.

You don't need to get attacked by a pro-Trump troll-bot horde to know that social media is a battleground for propaganda farms. It's pretty obvious, and miles of speculative digital ink have been spilled saying as much. An Oxford study this week is getting more of that ink spilled, confirming what we already knew. But no one's spelled out what it actually means.

The Computational Propaganda Research Project at the Oxford Internet Institute, University of Oxford, certainly tried. That's the paper everyone's talking about this week, by the way. It looked at case studies from researchers in nine countries, interviewed 65 experts, and analyzed tens of millions of posts across seven different social media platforms during moments of heightened government propaganda activity: elections, political crises and national-security incidents.

Not surprisingly, the paper found that "computational propaganda flourished during the 2016 US presidential election." Tell us Americans that and we'll remind you that bears make fecal deposits in the woods. We know, we knew, we saw it coming a mile away (but had no idea how to stop it). The same was true during the 2016 UK Brexit referendum, where political bots played a strategic role in shaping Twitter conversations and keeping pro-Brexit hashtags dominant.

The paper noted these incidents and a few more. It found that automated posting accounts, combined with fake news and troll armies and harassment campaigns, have reimagined the art and practice of authoritarian soft power in the 21st century.

Our "Facebook president"

The researchers wrote that Facebook plays a critical role in grooming young minds with political ideology because companies such as Facebook "are effectively monopoly platforms for public life."

Add Facebook advertising to the computational propaganda mix, and you've got a mind-blowing toolset for emotionally manipulating people -- without their knowledge -- into believing, saying and fighting for whatever you want.

The Oxford paper concluded that "computational propaganda is now one of the most powerful tools against democracy."

One thing we've learned in the past few years is that the core messages of political propaganda on social media are driven by humans. Their job is to cover up for people in power, motivate and empower harassment, and make us too discouraged to do anything about their wrongdoings. In case you're wondering, the people at the bottom of the propaganda chain know exactly what they're doing.

Some love their jobs; others do not. In 2015, one of Russia's professional trolls went to press detailing her role in making people think the murder of Russian opposition leader Boris Nemtsov was carried out by his own friends rather than by government hitmen, as is widely suspected. "I was so upset that I almost gave myself away," Lyudmila Savchuk said to the press.

The paid pro-government trolls work in rooms of 20; it was reported in 2015 that their numbers are in the thousands, making posts and comments all day, every day. Upon leaving, Savchuk said her goal in going to press with documentation, including video, was "to get it closed down," she told The Telegraph. "These people are using propaganda to destroy objectivity and make people doubt the motives of any civil protest. Worst of all, they're doing it by pretending to be us, the citizens of Russia."

Another ex-propaganda troll, Marat Burkhard, was assigned to spread racist memes about public figures like President Obama. It's enough to make one wonder more about America's rise in open racism online. "The most unpleasant was when we had to humiliate Obama, comparing him with a monkey, using words like darkie, insulting the president of a big country," he said.

"I wrote it, I had to." Saying he quit for his own sanity, he added, "if every day you are feeding on hate, it eats away at your soul." He also noted that in his particular propaganda factory, his office seemed split 50-50 in how everyone felt about what they were doing: Half were racist patriots, and the rest were just in it for the money.

That was all before the US election, and what became known as the Trump team's super-obvious social-media influence campaigns.

The new golden age of propaganda began much earlier than Brexit or 2016's American presidential disaster. Last year, Leo Benedictus revealed that political troll armies could be had for the right price in a range of countries that included Russia, Israel, Ukraine, the UK, North Korea, South Korea and Turkey. He wrote, "Long before Donald Trump met Twitter, Russia was famous for its troll factories -- outside Russia, anyway." He explained:

Allegations of covert propagandists invading chatrooms go back as far as 2003, and in 2012 the Kremlin-backed youth movement Nashi was revealed to be paying people to comment on blogs. However, most of what we know now comes from a series of leaks in 2013 and 2014, most concerning a St Petersburg company called Internet Research Agency, then just "Internet Research." It is believed to be one of several firms where trolls are trained and paid to smear Putin's opponents both at home and internationally.

OK, so we get that troll armies and their bots do propaganda stuff to make politicians look bad. But what happens when they go after regular people? Or, like in the US right now, take on an entire resistance movement?

We get a clear picture by looking at what Russia's government did to its resistance during the country's 2011-2012 elections for president and the Duma (its lower house of parliament). Just a couple of months before this week's Oxford paper came out, a more instructive study on social-media propaganda was published, called "Communication Power Struggles on Social Media: A Case Study of the 2011–12 Russian Protests."

When people started to mobilize and place calls to action on social media and blogs, Putin's patriotic hackers DDoS'd every site possible, including LiveJournal, where the government was already running its posting and commenting campaigns. Those they couldn't disable with traffic overload, like Twitter, they attacked by other means.

How? By manipulating people's perceptions and emotions about the resistance, according to the paper. "Our analysis suggests that, in particular, the Russian government successfully used Twitter to affect perceptions of the oppositional movement's success and legitimacy," the researchers wrote.

This included "diminishing and discrediting the resistance" (like insisting on low turnout numbers for protests), but also "exaggerating, enthusing, and claiming broad public support" for pro-government ... well, everything. They also elevated certain players -- by creating an appearance of popularity -- to be spokespeople for the propaganda topics of the day.

Finally, they created a culture of fear that encouraged people to self-censor.

"Spiral of silence"

The researchers noted how support for the anti-corruption, anti-Putin resistance took off on Twitter in December 2011, but that widespread delegitimization and belittling of the movement, along with the visibility of pro-Putin messages, had shifted the conversation by January 2012. In addition, "Critical voices were discredited and political elites were represented as legitimate."

The Russian regime's anti-resistance messaging made it seem "indisputable that Putin enjoyed broad support among Russians," and so "the protest movement began to dissolve quickly." The paper said:

Our analysis highlights that the growing feeling of futility and disillusionment affecting the oppositional movement more broadly was clearly reflected on Twitter in the weeks leading up to the presidential election. With the political discourse on Twitter beginning to noticeably shift in favor of the Putin supporters, oppositionally minded people on Twitter may have started to slide into a so-called "spiral of silence."

They perceived their political view to be in a shrinking minority, finding insufficient resonance in the discourse on Twitter, and gradually stopped speaking up, turning inward instead in growing self-doubt and disillusion.

They also distributed their messages well, reaching tons of people -- which, we should note, is social-media advertising's core promise. I think now we're starting to see exactly why Facebook's emotional-manipulation activities are a threat to democracy, in line with the Oxford study's conclusion about computational propaganda.

In the 2011 example, the Russian government, with all its resources, was far more effective at influencing people on Twitter than those who dared question the people in power.

In conclusion, the researchers wrote:

In the end, no matter how much "real" support Putin had, our analysis of the political discourse suggests that the perceived support had a real effect on the opposition and general public on Twitter. This shows that regardless of the promises that new digital technologies hold in terms of empowerment of marginalized or weaker (political) actors, these technologies are still part of the overall system of power—in particular, uneven resource distributions—and may therefore still be utilized by governments in their favor.

In other words, our study empirically confirms that indeed "whoever has enough money, including political leaders, will have a better chance of operating the switch in its favor."

It looks like a blueprint for what's happening on American Twitter day and night right now. Though compared to Russia's successful 2011 resistance suppression, Trump's trolls and botmasters are pretty bad at winning hearts and minds. Maybe that's partly why social-media propaganda is looking likely to get folded into the Mueller probe.

In any case, the new golden age of propaganda is here. The companies whose structures it thrives on, in all its hideousness and viciousness, are loath to change their business models to stop it. The illness is not our fault, though that's what they hope to convince us of, in this, our new futuristic system of oppression.

Just don't let the fact that it looks like Idiocracy make you take it any less seriously.

Image: OLGA MALTSEVA/AFP/Getty Images (Lyudmila Savchuk)