Facebook’s approach to fighting fake news is half-hearted

Rather than quitting cold turkey, it’s just cutting back.

Earlier this week, Facebook hosted a group of reporters (myself included) at its NYC office for a Q&A session about its efforts to fight fake news. The event, led by Head of News Feed John Hegeman and News Feed product specialist Sara Su, began with Facebook showing us a short film called Facing Facts, a documentary that debuted last May and tells the story of the company's uphill battle to rid its site of a misinformation plague that seems incurable. For months, Facebook has talked about how hard it is working to fix the issue (by hiring third-party fact-checkers, removing fake accounts and more), but on Wednesday it left us with more questions than answers. That's because Facebook believes reducing and flagging fake news stories is better than removing them altogether, and that doesn't seem like the best approach.

If Facebook wants to get serious about stopping the spread of false information, particularly from publishers, it needs to take a stand and ban it completely. Both Hegeman and Su said Facebook needs to be a platform for all ideas and to protect free speech, and that it therefore can't choose sides. Thing is, taking down stories or pages that promote debunked conspiracies or hoaxes isn't about taking political sides -- it's about acknowledging that there are facts and there are lies, and protecting people from the latter. Facebook believes "people can decide for themselves what to read, trust, and share," and that, by flagging stories as false or lowering their ranking in the News Feed, it's helping people make that decision. But that's simply refusing to take responsibility for its own role in spreading the disease of fake news.

When asked by CNN reporter Oliver Darcy why the company allows InfoWars, an outlet that traffics in dangerous and frankly insane conspiracy theories, to have a page and promote stories on Facebook, Hegeman said it was because the site has "not violated something that would result in them being taken down." He added, "being false... doesn't violate the community standards." Again, Facebook's argument here is that, if people see a story from InfoWars that may be false (like the one about the Democrats planning a second Civil War), it's better to let them decide for themselves whether they want to read it. Yes, if a piece of content is deemed "false" by one of its fact-checkers it will be labeled as such, but why give it an audience at all?

The challenge for Facebook is that while it may be able to keep false stories out of its News Feed, it would be virtually impossible (and possibly illegal) to keep people from sharing links to them with their friends. A Facebook spokesperson told Engadget that Facebook can only remove things that violate its Community Standards and "we don't have a policy that stipulates that everything posted to Facebook must be true. You can imagine how hard that would be to enforce." That said, she added, fake news offenders do lose monetization and advertising privileges, which "dramatically" reduces the distribution of their content on Facebook.

During the screening of Facing Facts, there was a scene that showed a big sign on a wall at a Facebook office reading "With connection comes responsibility," a message the company keeps repeating in its fight against fake news. Yet Facebook also says things like certain stories don't violate its Community Standards "just for being false." Or that "we don't have a policy that stipulates that everything posted to Facebook must be true," which should concern every one of Facebook's 2.2 billion users -- regardless of political views.

"There are strong views on both sides of this debate. But we've decided to demote news marked as fake by third-party fact-checkers rather than removing it entirely," the Facebook spokesperson said. "Once demoted, in our experience a piece of content gets 80 percent. fewer views going forward. We know people disagree but given the important free speech issues at stake we don't remove this content entirely."

Facebook is right that this isn't easy to solve, but no one said it would be. And if it wants people to think it's serious about fixing its fake news problem, it's going to need to do better. Because claiming that the best thing to do is stay neutral on the truth won't do the company, or its users, any good.