
Facebook should know by now what's news and what's spam

The social network is too big -- and too important -- to have a screwy news algorithm.

Late last week, The Guardian published an interview with a survivor of Obama's first drone strike, which occurred in tribal Pakistan on his third day as president. The piece detailed the attack's impact and raised concerns over the civilian harm such strikes can cause. But as important as this story is, you would not have been able to share it on Facebook: if you tried, the site would have blocked the link.

Spencer Ackerman, the story's author, learned about it from friends who tried to share the link and couldn't. He took to Facebook to voice his disappointment. The piece, after all, contained nothing inherently offensive -- no graphic imagery, no incendiary language. After his editors informed Facebook about the block, he was told it was an error: the link had somehow been marked as spam by Facebook's automated anti-spam system. The story has since been cleared of that false positive and can now be shared. Ackerman, for his part, has told us he believes it was an honest mistake.

But this is not the first time an innocuous news story has been flagged unfairly. In December last year, for example, a New York Times article about 1950s nuclear targets was blocked with a message that read, "The content you're trying to share includes a link that our security systems detected to be unsafe." In November, Facebook also initially blocked Boing Boing and Tech News Today stories about a Facebook rival called Tsu.co. Those stories were marked as spam. (Tsu is an incentive-based social network that pays its users for sharing and generating content, which Facebook says encourages spammy behavior.)

These stories were eventually approved and the bans lifted. But the fact that completely benign links can be falsely flagged as spam at all is worrisome. For many people -- almost 1.59 billion users, at last count -- Facebook is the predominant window to the world. It's the modern equivalent of a web portal, not unlike MSN, Yahoo or even AOL, Engadget's parent company. Facebook, for its part, doesn't seem to shy away from that role. In May of last year, it partnered with several news sites, like the New York Times and BuzzFeed, to host editorial content from those sites on Facebook's own servers. Ostensibly, this is to improve page load times, but it also keeps you, the user, within Facebook's walled garden. And walled gardens are no good if they keep you from reading and consuming outside links.

You might point out that these issues were all resolved in the end, but what about the stories we don't hear about? What about news links from smaller blogs or independent websites that don't have the same clout or reach as the Guardian or the Times? What if a legitimate news story gets blocked and nobody reports it? We might never know about it.

Of course, it's not entirely Facebook's fault, either. A social network of its size attracts a slew of spammers and folks who wish to flood the network with bad links. Sometimes spammers hide their links behind URL shorteners or attach them to an image to evade detection. In an explainer on its spam prevention system posted in 2010 (which a Facebook spokesperson claims is still relevant today), Facebook said that it devotes a tremendous amount of time and resources to building systems that "detect suspicious activity and automatically warn people about inappropriate behavior or links." It uses a combination of automated anti-spam tools, engineer intervention (engineers can write rules in real time to identify malicious content) and community reporting to filter the bad stuff out.
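Facebook hasn't published how that layered filter actually works, but purely as an illustration of how signals like these could be combined -- and how a benign link could still get caught -- here is a minimal hypothetical sketch. Every signal name, weight, threshold and domain below is invented for the example:

```python
# Purely hypothetical illustration -- Facebook has not published its real anti-spam code.
# Every signal name, weight and threshold here is invented for the sake of the example.

KNOWN_SHORTENERS = {"bit.ly", "goo.gl", "t.co", "ow.ly"}

# The kind of hard rule an engineer might add in real time during a spam wave.
BLOCKED_DOMAINS = {"example-spam-site.test"}

def looks_like_spam(link_domain, uses_shortener, user_reports, posts_per_minute):
    """Fold a few crude signals into a single spam score, the way a layered
    filter (automated heuristics + hand-written rules + community reports)
    conceptually might. Returns True once the score crosses a threshold."""
    if link_domain in BLOCKED_DOMAINS:           # engineer-written rule: hard block
        return True
    score = 0.0
    if uses_shortener or link_domain in KNOWN_SHORTENERS:
        score += 0.3                             # shorteners can mask the destination
    score += min(user_reports, 50) * 0.02        # community reports, capped
    if posts_per_minute > 10:
        score += 0.4                             # the same link flooding the network
    return score >= 0.7

# A legitimate news link with a few stray reports sails through...
print(looks_like_spam("theguardian.com", False, user_reports=3, posts_per_minute=2))   # False
# ...but a burst of sharing plus a shortened URL can push a benign link over the line.
print(looks_like_spam("theguardian.com", True, user_reports=30, posts_per_minute=12))  # True
```

The toy example is the problem in miniature: when enough noisy signals stack up, a perfectly legitimate story can start to look like spam to a machine.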

But sometimes legitimate stuff still gets hit with the ban hammer. In that same post from 2010, then-company spokesperson Matt Hicks wrote:

"Every once in a while, though, people misunderstand one of these systems. They incorrectly believe that Facebook is restricting speech because we've blocked them from posting a specific link or from sending a message to someone who is not a friend. Over the years, these misunderstandings have caused us to be wrongly accused of issues ranging from stifling criticism of director Roman Polanski over his sexual abuse charges to curbing support for ending U.S. travel restrictions on Cuba to blocking opponents of same-sex marriage."

It's unfortunate, then, that Facebook isn't more forthcoming about why and how it blocks certain links. Facebook said in the same post that it won't share details about how its anti-spam algorithm works, because spammers might otherwise learn to game the system. Indeed, when we asked Facebook to comment on what happened with Ackerman's Guardian story, the company simply pointed us to a comment left on his post in which a Facebook spokesperson said it was a false positive. But it's still disconcerting to know that a link can be blocked for no obvious reason beyond having been marked as spam. That would be a good enough excuse if Facebook were just for communication between friends and family. But when it's also a daily news source for a billion-plus people, it's not an excuse at all.

Some of these so-called false positives could be averted if Facebook took its role as news disseminator more seriously. Perhaps it could be more like Apple News, which combines the usual algorithmic news feeds with links curated by actual human beings. This would be right in line with Facebook's status as a modern-day web portal -- MSN, AOL and Yahoo all have full-time editors who curate their homepages. In fact, Facebook did at one point hire editors to curate news: It was for its Paper news-feed app, before that transitioned into what eventually became Instant Articles.


To be fair, Facebook's News Feed is different from that of Apple News or even Twitter Moments, in that it's built almost entirely around who your friends and family are. Your news feed is assembled by an algorithm that weighs the stories you tend to "Like" against the kinds of posts Facebook thinks will get the most engagement. In a way, your news feed is already curated -- just by a machine, not a person. For Facebook to hire editors to curate personalized news feeds for all 1.59 billion of its users might be asking too much.
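Facebook doesn't disclose how that ranking actually works, but a toy sketch makes the idea concrete. The weights, field names and "predicted engagement" numbers below are all invented for illustration:

```python
# Toy sketch only -- Facebook's actual News Feed ranking is unpublished and far more
# complex. The weights, field names and engagement numbers here are invented.

def rank_feed(posts, liked_pages):
    """Order posts by a simple score mixing personal affinity (pages you tend
    to 'Like') with a generic predicted-engagement signal."""
    def score(post):
        affinity = 1.0 if post["page"] in liked_pages else 0.0
        return 0.6 * affinity + 0.4 * post["predicted_engagement"]  # arbitrary mix
    return sorted(posts, key=score, reverse=True)

posts = [
    {"page": "TheGuardian", "predicted_engagement": 0.4},
    {"page": "ViralListicles", "predicted_engagement": 0.9},
]
# With affinity in the mix, the outlet you actually follow outranks the louder post.
print([p["page"] for p in rank_feed(posts, liked_pages={"TheGuardian"})])
```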

And yet, why can't it have both? A combination of human news curation and Facebook's powerful news-feed algorithm could send a strong message to users that Facebook really is their one-stop shop for everything happening in the world. And, more important, having real people monitoring the news might prevent legitimate stories -- like the one Ackerman wrote -- from going unseen.

[Image credit: Top, middle: Getty Images; bottom: Facebook]