Facebook has repeatedly come under fire over the last year as a distributor of "fake news", despite its repeated protests that it isn't a media company per se. The social network has taken a number of steps to push back against the influx of falsehoods, from hand-curating articles to rejiggering its news-surfacing algorithms, though these have done less to stem the tide than to give Facebook something to crow about. On Thursday, the company continued that trend by introducing an "educational tool" that will live at the top of the newsfeed and provide tips on how to spot false reports.
The company teamed up with the non-profit First Draft to develop the tool, which is being rolled out on a temporary basis to users in 14 countries. When someone clicks on it, the tool will redirect them to the Facebook Help Center and display information "including tips on how to spot false news, such as checking the URL of the site, investigating the source and looking for other reports on the topic," according to a Facebook blog post.
This is the latest in a series of rather weak attempts by Facebook to eliminate fake news from its network. The company claims in its blog post that "False news runs counter to our mission to connect people with the stories they find meaningful" and that "we know we have more work to do," yet it continually does the absolute bare minimum in response. This latest tool, like the "false news" reporting feature that debuted in December, is completely passive and relies exclusively on the user to take the initiative.
It's the same issue we saw with Facebook's illegal gun sale debacle last summer: the company creates an environment that facilitates illegal and toxic activities, then throws its hands up when called out and declares that it's its users' responsibility to clean up and police themselves.
The company has previously demonstrated that it is capable of applying technological solutions to the fake news problem. It has tested filtering in both Germany and France and tweaked its algorithms to shuffle untrustworthy articles lower in the news feed. Yet it continues to insist that its users take the lead.
Facebook has also announced that it is considering hiring third-party fact-checkers to police its news as well. "A commercial relationship is something that's on the table and that we are very open to," Adam Mosseri, Facebook's vice-president of product management for newsfeed, told The Financial Times. "It could depend on individual organisations, but we want to engage responsibly and if that means a financial arrangement, we are very open to it." Again, Facebook is sidestepping its responsibility to manage its network and is instead merely alluding to the possibility of taking proactive action in the future.