When Facebook announced it was rolling out a major overhaul to its News Feed earlier this month, it did so with the intention of prioritizing interactions between people over content from publishers. It was a notable shift in strategy for the company, which for the past couple of years had been working closely with news outlets to heavily promote their articles and videos. But Facebook discovered that people just weren't happy on the site -- likely due to the vast amount of political flame-throwing they've been exposed to since the 2016 US Presidential election. So, in order to alleviate this problem, it decided it was best if users saw more posts from friends and family, instead of news that could have a negative effect on their emotions. Because keeping people both happy and informed is, apparently, hard.
Unfortunately, Facebook's solution to this problem doesn't seem to be the best one. Last week, it said it plans to put front and center only links from outlets that users deem "trustworthy." Which just shows that Facebook would rather put the responsibility for policing misinformation on the community than on itself. This is concerning because Facebook is, essentially, letting people's biases dictate how outlets are perceived by its algorithms.
The full Facebook news trustworthiness survey. In its entirety. https://t.co/bd0qkkXGgN pic.twitter.com/oUvTZLNiyB
— Alex Kantrowitz (@Kantrowitz) January 23, 2018
As reported by BuzzFeed News, Facebook's "trusted" news source survey consists of two simple questions: "Do you recognize the following websites?" and "How much do you trust each of these domains?" For the first, you can answer only "yes" or "no," while the second offers "entirely," "a lot," "somewhat," "barely" or "not at all." That doesn't seem like the best or most thorough way to judge editorial integrity. Not only that, but the survey doesn't account for personal biases, leaving the system wide open to abuse. This is only going to encourage people to continue living in a bubble of their own creation, where there's no room for information or opinions that challenge their worldview.
Facebook seems to think the benefit of its survey is that it's simple and straightforward, though that's actually why it's so misguided. There's no room for nuance or additional context. How many people will say they don't trust the New York Times or CNN simply because President Donald Trump calls them "Fake News" any chance he gets? Sure, those particular outlets shouldn't have any problem being recognized as legitimate, but even labeling them as such doesn't seem to be a responsibility Facebook is willing to take on. A Facebook spokesperson told Engadget that Facebook is a platform for people to "gain access to an ideologically diverse set of views," adding the following:
We surveyed a vast, broadly representative range of people (which helps -- among other measures -- to prevent the gaming-of-the-system or abuse issue you noted) within our Facebook community to develop the roadmap to these changes -- changes that are not intended to directly impact any specific groups of publishers based on their size or ideological leanings.
Instead, we are making a change so that people can have more from their favorite sources and more from trusted sources. I'd also add that this is one of many signals that go into News Feed ranking. We do not plan to release individual publishers' trust scores because they represent an incomplete picture of how each story's position in each person's feed is determined.
"There's too much sensationalism, misinformation and polarization in the world today," Zuckerberg said in Facebook post announcing the News Feed changes. "Social media enables people to spread information faster than ever before, and if we don't specifically tackle these problems, then we end up amplifying them. That's why it's important that News Feed promotes high quality news that helps build a sense of common ground."
The problem is that, by letting its users control how the system works, Facebook may actually end up amplifying the fake news bubble it helped create. Facebook users were instrumental in spreading misinformation and Russian propaganda during the 2016 US Presidential election. According to the company's own data, over 125 million Americans were exposed to Kremlin-sponsored pages on Facebook.
"The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division," Zuckerberg added. He said that Facebook could try to make that decision itself, but that "that's not something we're comfortable with."