Facebook has published a new report on the state of influence operations on its platform. The report, which covers the period between 2017 and 2020, sheds new light on the company’s efforts to prevent election interference, and how attempts to manipulate its platform have evolved.
The report notes that influence campaigns have changed considerably since 2016, when Russia’s Internet Research Agency relied heavily on networks of fake accounts. While Facebook is still uncovering IRA activity, the group's tactics are changing. For example, last year Facebook and Twitter uncovered an IRA scheme that involved recruiting US-based journalists to author articles for a fake news site meant to prop up its influence campaigns. (Both Facebook and Twitter said at the time that the fake accounts were found before they could reach a large audience.)
“Threat actors basically faced an empty field in 2016, and the world is very different today,” Facebook’s Head of Security Policy, Nathaniel Gleicher, said during a call with reporters. “But also… there are more actors who are using these techniques today than was the case in 2016.”
Another major difference between now and 2016 is that “inauthentic behavior” is more frequently coming from within the country being targeted, not just from foreign actors. In the report, Facebook notes that it took down an equal number of CIB (coordinated inauthentic behavior) networks from Russia, Iran and the US itself.
Of those originating in the US, “more than half were campaigns operated by conspiratorial and fringe political actors that used fake accounts to amplify their views and to make them appear more popular than they were.”
Also complicating matters: these networks often relied on real people to spread their message or run fake accounts. And though the report doesn’t specifically name Donald Trump, it notes that the “then-US President” was among those “promoting false information amplified by IO [influence operations] from various countries including Russia and Iran.”
Of course, Facebook dealt with numerous other issues surrounding the election besides fake accounts. The company was slow to act against QAnon and other extremists prior to the election, and these groups were able to spread conspiracy theories relatively unchecked for months. Following the election, Facebook struggled to contain the “Stop the Steal” movement, which fueled the violence on Jan. 6. An internal report from Facebook suggested that the company’s focus on looking for fake accounts may have blinded it to the dangers posed by legitimate accounts spreading conspiracy theories.
“The US 2020 election campaign brought to the forefront the complexity of separating bad actors behind covert influence operations from unwitting people they co-opt or domestic influencers whose interests may align with threat actors,” the report says.