Image credit: Associated Press

Facebook denies filtering conservative news stories

The company claims it has neutrality guidelines in place to prevent this sort of thing.

Even if your Facebook News Feed is full of family members dropping racist memes or links to factually inaccurate articles, you might not see such content showing up in the "trending news" portion of the social network's landing page. And there's a reason for that: workers "routinely suppressed" news stories of interest to conservative users from the section, according to a report from Gizmodo. Those stories apparently include anything about the Conservative Political Action Conference, two-time Republican presidential hopeful Mitt Romney and posts from conservative news outlet The Drudge Report.

More than that, it appears Facebook wouldn't curate a story of conservative origin (Breitbart, for example) unless it was picked up by The New York Times or the BBC first. Facebook's company line, in light of the Gizmodo report, is that it "takes allegations of bias very seriously" and has "rigorous guidelines" to ensure consistency and neutrality, guidelines that don't "permit the suppression of political perspectives." Worth noting: the sources for these allegations were contract workers, not full-time employees, who worked for Facebook from the middle of 2014 until December 2015.

What appears in the Trending News module isn't determined exclusively by an algorithm tracking what users are actively sharing; it's curated much like an editorial newsroom. One of Gizmodo's sources, who leans politically conservative, says that what populated the list was largely determined by who was working at the time. If that person didn't subscribe to conservative points of view, a story could be blacklisted. And if a particular story was trending on Twitter but not on Facebook, it would be "injected" into the Trending News section; specific instances include the Black Lives Matter conversation and the ongoing conflict in Syria.

This isn't the first time Facebook has come under fire for this type of thing. In 2014, the company admitted to running a controversial experiment that altered the News Feed to measure users' emotional responses.

Update: Tom Stocky, Facebook's Vice President of Search, has posted a statement refuting the Gizmodo report.

Via: TechCrunch
Source: Gizmodo
Coverage: The Verge
