
Facebook didn't stop fake news because it's afraid of conservatives

An update to the social network would have disproportionately removed stories appealing to right-wing readers.


In the last week, Facebook has been battling the accusation that fake, often inflammatory stories showing up in users' News Feeds influenced how people voted in the presidential election. The social media giant has vowed it is taking the issue seriously and is searching for an as-yet-unspecified solution, even as CEO Mark Zuckerberg personally defended Facebook, claiming that over 99% of stories on the network are authentic and that it was "extremely unlikely" fake news swayed the election outcome. But within Facebook, a fierce debate has allegedly raged since May over whether to deploy an update that curbs fake and hoax news. According to Gizmodo's sources, the company didn't ship it because stories from conservative news outlets would have been disproportionately downgraded and removed from users' News Feeds.

To be clear, not much is known about how effectively the update would have scrubbed fake news. But ultimately, it was shelved and buried, sources told Gizmodo. One said, "They absolutely have the tools to shut down fake news," but that product decisions (i.e., whether to deploy the update) were affected by fear of offending conservative readers even further after a mini-scandal six months earlier.

Back in May, Facebook's Trending Topics section drew flak over reports that its curation team "routinely suppressed" stories of interest to conservative readers. That likely contributed to the decision to pink-slip the team and ditch human control of Trending Topics entirely in favor of a supposedly impartial algorithm in August. The incident soured the company's mood, according to a New York Times report released last weekend: "The Trending Topics episode paralyzed Facebook's willingness to make any serious changes to its products that might compromise the perception of its objectivity."

Without its human editors, hoax stories blossomed across the social network. Just after the Trending Topics team was let go, fake reports circulated claiming Fox News anchor Megyn Kelly had been fired, and a 9/11 tabloid conspiracy story rose to the top of the algorithm-controlled Trending Topics just before the anniversary.

But it's election-related fake news that has raised concerns in the last few days. Posts like "FBI Agent Suspected in Hillary Email Leaks Found Dead in Apparent Murder-Suicide" and "Pope Francis Shocks World, Endorses Donald Trump for President, Releases Statement," as Gizmodo points out, circulated in the run-up to the election. In his November 12th post addressing the issue, Zuckerberg rejected the premise that those kinds of stories affected voters: not only do they account for less than 1% of content passing around Facebook, he said, but hoax posts showed up on both sides of the aisle, and many don't involve politics at all.

Even by napkin math, that's still almost 1% of the stories shown to Facebook's 1.79 billion users. And since 44% of US adults use the social network as a news source, according to a Pew survey, that's a sizable chunk of Americans who kept seeing fake news, and some fraction of them trusted those stories' authenticity enough to keep circulating them. Considering how many states' electoral votes hinged on margins of less than one percent, the marginal influence of fake news isn't something to dismiss.
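To put rough numbers on that napkin math, here's a minimal sketch using only the figures cited above (1.79 billion users, Pew's 44% figure and Zuckerberg's "less than 1%" share). The stories-seen-per-day count and the US adult population are illustrative assumptions, not figures reported anywhere in this story.

```python
# Back-of-envelope estimate of fake-news exposure on Facebook.
# Figures from the article: 1.79B monthly users, <1% of content is hoaxes,
# 44% of US adults (Pew) use Facebook as a news source.
# STORIES_PER_DAY and US_ADULTS are rough assumptions for illustration only.

FACEBOOK_USERS = 1_790_000_000   # monthly active users (Q3 2016)
FAKE_SHARE = 0.01                # "less than 1%" -- treated as an upper bound
US_ADULTS = 245_000_000          # rough estimate of the US adult population
PEW_NEWS_SHARE = 0.44            # share of US adults getting news on Facebook

STORIES_PER_DAY = 100            # assumed feed items seen per user per day

# Upper-bound fake stories served across the whole network each day
fake_stories_per_day = FACEBOOK_USERS * STORIES_PER_DAY * FAKE_SHARE
print(f"Fake stories served per day (upper bound): {fake_stories_per_day:,.0f}")

# Americans who use Facebook for news and could encounter those stories
us_news_users = US_ADULTS * PEW_NEWS_SHARE
print(f"US adults getting news on Facebook: {us_news_users:,.0f}")
```

Under these assumptions, the exposure runs into billions of impressions per day even with the 1% figure treated as a ceiling, which is the scale the argument above turns on.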

Update: A Facebook spokesperson shared this statement with Engadget over email, responding to Gizmodo's story:

The article's allegation is not true. We did not build and withhold any News Feed changes based on their potential impact on any one political party. We always work to make News Feed more meaningful and informative, and that includes examining the quality and accuracy of items shared, such as clickbait, spam and hoaxes. Mark himself said "I want to do everything I can to make sure our teams uphold the integrity of our products." This includes continuously reviewing updates to make sure we are not exhibiting unconscious bias.