If a political campaign is an engine, then propaganda is its oil and the medium of communication with voters is its gas.
The hacking of the DNC and the tons of raw propaganda crude pumped out through WikiLeaks, Breitbart and the Daily Stormer were black gold for the Trump campaign.
We're now learning, thanks to the Facebook Files, that the accelerant for the campaign's ability to connect with and inflame its racist, anti-immigrant base was Facebook itself. Facebook's internal rulebook could be considered the blueprint for the Trump campaign's highly successful, race-fueled ad messaging on the social network.
"Donald Trump is our first Facebook president," concludes The New York Review of Books, after spending considerable time with Prototype Politics: Technology-Intensive Campaigning and the Data of Democracy by Daniel Kreiss and Hacking the Electorate: How Campaigns Perceive Voters by Eitan D. Hersh.
"His team figured out how to use all the marketing tools of Facebook," NYRB wrote. "They understood that some numbers matter more than others—in this case the number of angry, largely rural, disenfranchised potential Trump voters—and that Facebook, especially, offered effective methods for pursuing and capturing them."
Indeed it did, because Facebook's own rules around speech and censorship appear to make the social media site a safe space for racists and terrorists in equal measure. We already knew it wasn't a safe place for LGBT people or vulnerable teens, and certainly not for domestic violence victims or women who talk about human sexuality.
But not until The Guardian published the Facebook Files this week did we have a firm grasp on the inverse question: Just who is Facebook safe for, anyway?
A clean, well-lit place for racists
Racists, revenge porn perpetrators, Holocaust deniers and people who think immigrants are filth: One look at the company's rulebook on content moderation reveals these upstanding members of Facebook's community get a free pass on cultivating hate. As long as they don't run afoul of a very slim, arguably subjective set of edge case rules, that is.
Holocaust denial is a particular piece of "free expression" that Facebook is keen to defend -- to the point of enforcing takedowns only in countries where it thinks it might get sued. According to a Facebook training manual obtained by The Guardian, the company hides or removes Holocaust denial content in only four countries (France, Germany, Israel and Austria).
The manual says, "Some 14 countries have legislation on their books prohibiting the expression of claims that the volume of death and severity of the Holocaust is overestimated. Less than half the countries with these laws actually pursue it. We block on report only in those countries that actively pursue the issue with us."
When reached for comment, Facebook told us, "We currently prevent access to Holocaust denial content in the following countries: Italy, Spain, Netherlands, Belgium, France, Germany, Israel and Austria. We keep our policies under continual review and our policy and legal teams are currently looking at our obligations in respect of Holocaust denial."
The company's head of global policy management, Monika Bickert, added, "We recognize the importance and the sensitivities around the issue of Holocaust denial and have made sure that our reviewers are trained to be respectful of that sensitivity."
While it's not quite a "we're sorry you feel that way," we only hope that Facebook understands that Holocaust denial is less like a nut allergy and more like ignoring the facts of genocide for the express purpose of harm.
Bad press would be the only way to get Facebook to pretend to behave responsibly here, but it's a loss out of the starting gate for a company that clearly doesn't get what's wrong with this in the first place.
And something is very deeply wrong here, on a company-culture level, for this to be happening at all. Should Facebook respond to the bad press about its support of Holocaust denial under the flimsy rhetoric of "free speech," it will just be a target-specific response. Meaning, they'll fix one thing while the bigger problems remain.
Those problems being its inability to grasp why it is such an excellent incubator for hate, and its willingness to fix things only after we're all totally fucked.
Fertile ground for Trump voters
The Guardian's Facebook Files are an abridged version of the company's content moderation policies, and they reveal an obvious forgiveness for the cultivation of hatred against immigrants, and thus, people of color. While Facebook is lightning fast to censor a gay man's post saying Trump supporters are "a nasty, fascistic lot," anyone characterizing immigrants as rapists or robbers gets a free pass (as long as they're not "equating" them with rapists or robbers).
Facebook's permissible statements include: "Islam is a religion of hate. Close the borders to immigrating Muslims until we figure out what the hell is going on"; "migrants are so filthy"; "migrants are thieves and robbers"; and "Mexican immigrants are freeloaders mooching off of tax dollars we don't even have." These are all statements in Facebook's rulebook marked as "ignore" when reported.
In the documents we learn that "All terrorists are Muslims" is okay to say on Facebook, but not "All Muslims are terrorists." The Guardian explains that this is because "terrorists are not a protected category, whereas Muslims are – which is why the first remark can be ignored and the second should be deleted if flagged."
Oh, okay. Because the people saying these things will totally get the difference, as they have all been educated about the way Facebook interprets the meanings of these things. In fact, the people saying these things won't see any difference, by their very nature. According to the Southern Poverty Law Center, people who are swayed by the rhetoric of hate groups have two primary characteristics: They seek community and have a "low tolerance for ambiguity." Meaning, Facebook can split hairs over sentence construction all day, but when it comes to cultivating hate, it's the general context that matters. Racists literally don't care about "grey areas."
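As a toy illustration of how mechanical this hair-splitting is, the rule The Guardian describes reduces to a single check on the sentence's target. Everything below -- the category list, the crude subject extraction -- is a simplification invented for illustration, not Facebook's actual system:

```python
# Toy model of the moderation rule The Guardian describes: a flagged
# statement is deleted only when its *subject* is a protected category.
# The category list and the naive parsing are illustrative assumptions,
# not Facebook's real implementation.

PROTECTED = {"muslims", "women", "immigrants"}  # simplified, hypothetical list

def moderate(flagged_statement: str) -> str:
    """Return 'delete' if the statement's subject is a protected
    category, else 'ignore' -- regardless of the statement's intent."""
    subject = flagged_statement.lower().split()[1]  # naive: "All X are Y"
    return "delete" if subject in PROTECTED else "ignore"

# The two sentences from the Facebook Files land on opposite sides:
print(moderate("All Muslims are terrorists"))   # -> delete
print(moderate("All terrorists are Muslims"))   # -> ignore
```

The point of the sketch is that nothing in the check looks at context or intent -- exactly the hair the SPLC says hate-group audiences won't split.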
In a way it doesn't matter anyway because Facebook's documents end in a shrug when it comes to things that are difficult to decide -- the default is always to leave it up. Moderators only have around ten seconds to make that decision anyway.
Facebook's Bickert told Engadget: "Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously."
For its part, Facebook told us, "Mark Zuckerberg recently announced that over the next year, we'll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly."
Facebook's moderation rules appear to have been created backward: They quibble over the company's beliefs about protected groups, while failing to grasp the bigger picture about what kind of environment is created by these kinds of misguided, made-in-a-bubble rules.
Rather than focus on creating a large collection of safe communities, the rules seem formed to fold in preexisting xenophobia and racism and hand them Facebook's tools to thrive; one might call it "a clean, well-lit place for fascism." Meaning, the moderation rulemakers either really don't understand how hate groups form and grow, or they're fine with whatever as long as there's no bad press and everyone remains an active user. Gotta chase those ad dollars, yo.
Facebook has become a hate-group incubator; the company can't even wrap its head around the problem of fake news enough to prevent its own program to fight fake news from failing -- which it is. Not surprisingly, fake news is weighted toward neo-Nazi, pro-Trump propaganda, though Facebook won't quite admit that key piece of information that could stop its spread. Instead, Facebook's weak excuse of a program to mark fake news as "disputed" is being seized upon and promoted -- shared more widely than it otherwise would have been -- by alt-righties who marshal their sizable Facebook troops and disseminate it with cries of "censorship!"
A lot of noise was made in the press after the election about the Trump campaign's use of Cambridge Analytica and the way the skeezy data company exploited Facebook users (in addition to the 30 million Facebook users exploited for Trump by an unnamed data mining company). But The New York Review of Books tells us that Facebook was already packed with people ready to vote for Trump, and "Facebook's real influence came from the campaign's strategic and perfectly legal use of Facebook's suite of marketing tools."
Brad Parscale was the genius who put together Trump's Facebook outreach strategy. "A few weeks before the election," NYRB wrote, "he said he had a hunch from reading Breitbart, Reddit, Facebook, and other nontraditional news sources, and from the campaign's own surveys, that there were whole segments of the population -- people who were angry and disaffected -- that were being missed by traditional pollsters and the mainstream media."
And so he went to work during the primaries, purchasing $2 million in Facebook ads -- eventually ramping spending up to $70 million a month, most of it going to Facebook. The New York Review of Books quotes Trump digital team member Gary Coby telling WIRED: "On any given day...the campaign was running 40,000 to 50,000 variants of its ads ... On the day of the third presidential debate in October, the team ran 175,000 variations."
"He then uploaded all known Trump supporters into the Facebook advertising platform and, using a Facebook tool called Custom Audiences from Customer Lists, matched actual supporters with their virtual doppelgangers and then, using another Facebook tool, parsed them by race, ethnicity, gender, location, and other identities and affinities. From there he used Facebook's Lookalike Audiences tool to find people with interests and qualities similar to those of his original cohort and developed ads based on those characteristics, which he tested using Facebook's Brand Lift surveys."
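The pipeline NYRB describes -- match known supporters to their platform profiles, segment them by attributes, then expand to similar users -- can be sketched in plain Python. This is a conceptual model only; the data structures and matching logic are invented for illustration and bear no relation to the internals of Facebook's actual Custom Audiences or Lookalike Audiences tools:

```python
# Conceptual sketch of the audience-building pipeline described above:
# 1. match a supporter list against user profiles ("Custom Audiences"),
# 2. segment the matches by an attribute such as location,
# 3. expand to users sharing interests with the seed ("Lookalike").
# All names and structures here are illustrative assumptions.

from collections import defaultdict

def match_supporters(supporter_emails, users):
    """Step 1: find platform profiles for known supporters."""
    emails = set(supporter_emails)
    return [u for u in users if u["email"] in emails]

def segment(audience, attribute):
    """Step 2: parse the matched audience by an attribute."""
    groups = defaultdict(list)
    for u in audience:
        groups[u[attribute]].append(u)
    return dict(groups)

def lookalikes(seed, users, min_shared=2):
    """Step 3: expand to non-seed users sharing enough interests."""
    seed_emails = {u["email"] for u in seed}
    seed_interests = set().union(*(u["interests"] for u in seed))
    return [u for u in users
            if u["email"] not in seed_emails
            and len(set(u["interests"]) & seed_interests) >= min_shared]

users = [
    {"email": "a@x.com", "location": "OH", "interests": ["hunting", "nascar"]},
    {"email": "b@x.com", "location": "PA", "interests": ["hunting", "fishing"]},
    {"email": "c@x.com", "location": "OH", "interests": ["hunting", "fishing"]},
    {"email": "d@x.com", "location": "CA", "interests": ["surfing"]},
]

seed = match_supporters(["a@x.com", "b@x.com"], users)  # 2 matched supporters
by_state = segment(seed, "location")                    # {'OH': [...], 'PA': [...]}
expanded = lookalikes(seed, users)                      # c@x.com qualifies; d@x.com doesn't
```

The design point the sketch makes: each step narrows or widens the net automatically, so one uploaded list becomes thousands of micro-targeted segments with no manual audience research.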
NYRB diagrams how Trump's team, unlike the Democrats, figured out how to use Facebook "to successfully sell a candidate that the majority of Americans did not want." They used Facebook's own tools, refined to target those most vulnerable to suggestion, to influence those ripening under Facebook's own rules that coddle Holocaust denial and anti-immigrant and anti-Muslim sentiment. This could be plainly seen in Trump's Facebook ads.
There was a joke after the election that Facebook's motto "Move fast and break things" was better spoken as "Move fast and break democracy." But this is bigger than that.
In the United States, racist speech is considered free speech because it is opinion. Also in the US, hate groups are not illegal, but they are kept in check by a system of federal hate-crime laws. Now imagine a company like Uber being allowed to determine what can and cannot be encouraged in communities dedicated to sexual harassment and abuse, and you can see why this might not work.
Facebook is not a country, yet it is assigning and removing rights about censorship and speech. It is not a civil rights organization, yet it decides that immigrants don't have protected status within its walls.
It's a company. One that is creating censorship tools so it can do business in China. It is a behemoth that complies with censorship demands from the governments of Thailand, Turkey, India, Israel, Pakistan and Vietnam.
It's also a company to which there is no alternative.
Images: REUTERS/Brendan McDermid (Protestors); REUTERS/Jim Young (Facebook, Zuckerberg)