The US Senate is about to receive a report detailing Russia's online manipulation attempts during and after the 2016 Presidential election, and it promises to shed new light on the subject... including a lack of evidence from the internet giants themselves. The research, conducted by Oxford University and network analysis company Graphika, outlines some familiar strategies. The Russians "clearly sought to benefit" the Republicans, rallying support for Trump on social networks while trying to "confuse, distract and ultimately discourage" Trump's opponents. It also notes that Russia's digital influence plans started with Twitter, but quickly expanded to Instagram and YouTube -- Facebook actually came last. The campaign also tried smaller social networks like Google+, Pinterest and Tumblr (owned by Engadget parent company Verizon) as well as email.
The report unsurprisingly notes that internet companies had a "belated and uncoordinated" response to Russia's meddling campaign, in some cases pulling accounts, offering tools and launching war rooms well after the election was over. However, the investigators also blasted firms for providing incomplete or difficult-to-study data. Facebook gave the Senate info on some Russia-linked accounts and posts but not others, according to the report. Twitter and YouTube, meanwhile, made it difficult to scrutinize their info -- for YouTube, the researchers had to hunt down links to videos on other sites to gauge the scope of the service's role.
The Russians also made rookie mistakes that could have been used to spot their activities earlier, such as buying ads with rubles and leaving signatures in logs that pointed to a Russian base of operations.
The Senate report should be public within the next several days. While the findings aren't completely shocking, they could be influential as politicians consider how to study the 2018 midterms and prepare for 2020. They also suggest that attitudes toward social networks need to change if they haven't already. Where these sites were previously seen as forces for good, they're increasingly being exploited as a "computational tool for social control" both in democracies and autocracies.