
Can we put the fake news genie back in the bottle?

First and foremost, Facebook needs to take its role in disseminating news seriously.


The 2016 presidential campaign has definitively shown us that you shouldn't rely on Facebook for all of your news. Even at its best, you're likely to be exposed primarily to viewpoints and stories you already agree with. Being ensconced in an internet belief bubble strips away a lot of the nuance that exists in the real world, and that lack of nuance likely helped Donald Trump become the next President of the United States.

But beyond the narrow viewpoint that comes from getting news through Facebook is a bigger problem: Fake news has been proliferating on the site at a rapid pace. In August, Facebook made some changes to its "trending news" section, removing human editors and replacing them with an algorithm. Ironically, the move seemed like a response to reports that those human editors were biased against conservative news. Without those editors patrolling the trending section, it became much easier for false stories to slip through. Indeed, a false report about Fox News anchor Megyn Kelly being fired spread like wildfire just days after the change was announced.

Things have gotten worse since then -- over the last week and a half, fake news on Facebook has been widely cited as a potential difference-maker in the election results. And Facebook continues to duck its responsibility even as reports circulate about the company's dysfunctional process for flagging and removing false news stories. It's a topic that's inescapable right now -- Google Trends shows searches for "fake news" spiking in the last month.

And now President Obama has addressed the issue, speaking out against it in a press conference with German Chancellor Angela Merkel as well as in an excellent, wide-ranging interview just published by the New Yorker. "If we are not serious about facts and what's true and what's not," he said yesterday, "and particularly in an age of social media when so many people are getting their information in sound bites and off their phones, if we can't discriminate between serious arguments and propaganda, then we have problems."

The fake news genie is most definitely out of the bottle -- so what do we do next? Alexios Mantzarlis, the head of Poynter's International Fact-Checking Network, admits it's a big problem but thinks the first place to start is by distinguishing between fake news and misleading news and pushing back against the complete fabrications. "I've seen lists of fake news sites going around that [incorrectly] included some alt-right sites," he says. "I'm not suggesting that those sites are 100 percent accurate -- but there's a difference between a story that is misleading and a story that is outright fake. The battle that we can win is against 100 percent fake stories, so let's start there."

Mantzarlis mostly focused on what Facebook can do to stem the tide, a reasonable position given that about a quarter of the world's population uses the platform. "If there are humans at Facebook, they can fact check -- at least let's keep fake news out of trending," he says. "And if the team sees fake news, tag it so everyone knows. The Megyn Kelly hoax is shocking to me -- it kept getting engaged with after Facebook itself apologized for it being a hoax!" While Facebook says it got rid of the team working on trending news, the company still has humans reviewing posts that get flagged, as NPR reports.

Coming up with standards that Facebook can apply quickly and consistently is clearly a major challenge, but it's something the company needs to get serious about. "[Mark] Zuckerberg has said that determining what is true is hard, and I agree with that," says Mantzarlis. "But when you have fake stories that have been yanked out of trending, they should [be] clearly flagged [as fake]. If you don't want to delete content, at least annotate it -- don't let it go on more News Feeds afterwards!"

The challenge facing Facebook is walking the line between letting people share whatever they want with their friends -- what the platform was originally built for -- and cracking down on misinformation. "I'm aware of a strong backlash that is ready to surge as soon as Facebook pushes back too hard... they are walking a tightrope, and I wouldn't want to be them," Mantzarlis says. "They make decisions about how to curate the News Feed all the time -- if quality of the News Feed is important, I don't see why they can't be more particular about this stuff."

Unfortunately, with the ball so firmly in Facebook's court, we're going to be dependent on the company to put the genie back in the bottle, and the question remains how interested it will be in taking this challenge on. Zuckerberg is clearly irked by the suggestion that fake news shared on Facebook influenced the election. He claims that over 99 percent of content on the site is authentic. But despite his skepticism, the company has gone on record saying it would fight fake news -- though it didn't say how. Some aren't waiting to find out: reports indicate dozens of employees are privately investigating how the company deals with such matters.

While Facebook bears a significant burden here, the problem extends beyond what it can control -- all the way to the White House, in fact. President-elect Donald Trump himself has long played fast and loose with the truth (to put it lightly), and some in the media are noting that they're essentially competing against his Twitter feed in an effort to tell the truth. It's a tendency that fake news maven Paul Horner successfully exploited throughout the campaign. "[Trump] just said whatever he wanted, and people believed everything, and when the things he said turned out not to be true, people didn't care because they'd already accepted it," Horner told The Washington Post. "His followers don't fact-check anything -- they'll post everything, believe anything."

Trump can crow about keeping a Ford plant from moving to Mexico, and the media can investigate those claims and prove them false -- but at this point, the truth can quickly become secondary to the narrative Trump has created. His statements are out there, and his army of supporters won't accept an alternate viewpoint as the truth. It's no wonder the horrible phrase "post-truth" was named Oxford Dictionaries' word of the year.

So the war on fake news has really just begun -- but anyone using Facebook or otherwise reading the internet can do their part to hold sources accountable. Reporting bad stories on Facebook isn't a perfect solution, but it's what we can do for now. Venturing outside of Facebook and going directly to reliable outlets that aren't algorithmically served to you is a smart idea as well. If you're wondering what's real and what isn't, this document is a great place to start. Perhaps the most important thing you can do is read as many sources on a particular bit of news as you can: if one site claims Trump won the popular vote and 50 others say otherwise, common sense will tell you which one is lying.