Facebook: If your feed is an echo chamber, you need more friends

Facebook wants you to know that you've only got yourself to blame for the lack of diverse views in your News Feed. The social network recently conducted a study to find out why people mostly see posts that mirror their own beliefs, and whether a "filter bubble" is to blame. A "filter bubble" is what happens when a website's algorithm shows you only posts resembling the ones you've already clicked (or Liked) and commented on. For this particular study, the company used anonymized data from 10.1 million Facebook users who list their political affiliation on their profiles. Researchers tracked "hard news" links posted on the website and looked at whether they were shared by conservatives, liberals or moderates.
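Facebook hasn't published its ranking code, but the basic idea behind a filter bubble is easy to sketch. The toy example below is a hypothetical illustration, not Facebook's actual algorithm: it scores posts by how often the user has previously engaged with each topic, so one-sided clicks and Likes push cross-cutting posts down the feed.

```python
# Hypothetical sketch of engagement-based ranking -- not Facebook's algorithm.
from collections import Counter

def rank_feed(posts, engagement_history):
    """Order posts by how often the user engaged with each post's topic before."""
    topic_counts = Counter(engagement_history)      # e.g. {"liberal": 12, "conservative": 1}
    total = sum(topic_counts.values()) or 1
    return sorted(
        posts,
        key=lambda post: topic_counts[post["topic"]] / total,  # familiar topics score higher
        reverse=True,
    )

posts = [
    {"id": 1, "topic": "conservative"},
    {"id": 2, "topic": "liberal"},
    {"id": 3, "topic": "liberal"},
]
history = ["liberal"] * 12 + ["conservative"]        # mostly one-sided clicks and Likes
print([p["id"] for p in rank_feed(posts, history)])  # -> [2, 3, 1]: the cross-cutting post sinks
```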

The result? According to a blog post on Facebook Research, which details the study's findings:

"While News Feed surfaces content that is slightly more aligned with an individual's own ideology (based on that person's actions on Facebook), who they friend and what content they click on are more consequential than the News Feed ranking in terms of how much diverse content they encounter."

The study admits that the filter bubble effect is real -- to varying degrees, depending on political affiliation -- but it claims the website's algorithms don't play that big a part. Who you're friends with apparently has a more profound effect on your News Feed, with the study pointing out that "birds of a feather flock together":

"Friends are more likely to be similar in age, educational attainment, occupation and geography. It is not surprising to find that the same holds true for political affiliation on Facebook."
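The comparison the researchers draw is essentially a funnel: how much "cross-cutting" content (links from the other side of the aisle) survives at each stage -- what your friends share, what the ranking actually surfaces, and what you end up clicking. The sketch below uses invented toy numbers, not the study's data, just to show the kind of calculation involved.

```python
# Illustrative sketch with invented numbers -- not the study's data or code.

def fraction_cross_cutting(leanings, own_leaning):
    """Share of hard-news links whose leaning differs from the user's own."""
    return sum(1 for leaning in leanings if leaning != own_leaning) / len(leanings)

# Hypothetical toy data: each string is the leaning of one hard-news link.
shared_by_friends = ["liberal"] * 76 + ["conservative"] * 24   # what friends post
shown_in_feed     = ["liberal"] * 78 + ["conservative"] * 22   # after News Feed ranking
clicked           = ["liberal"] * 80 + ["conservative"] * 20   # what the user opens

user = "liberal"
for stage, links in [("friends shared", shared_by_friends),
                     ("feed surfaced", shown_in_feed),
                     ("user clicked", clicked)]:
    print(f"{stage:>14}: {fraction_cross_cutting(links, user):.0%} cross-cutting")
```

In the study's framing, the gap between the first two stages is the algorithm's doing, while the gap between the last two comes down to the user's own choices.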

However, Eli Pariser, who once gave a TED talk on the perils of the filter bubble, warns that the study might be downplaying the effects of the Facebook algorithm. "Certainly, who your friends are matters a lot in social media," he writes in his response to the study on Medium. "But the fact that the algorithm's narrowing effect is nearly as strong as our own avoidance of views we disagree with suggests that it's actually a pretty big deal."

Pariser isn't the study's only critic, either: Christian Sandvig of the Social Media Collective argues that only a very small percentage of Facebook users volunteer "interpretable ideological affiliations" on their profiles, which was a requirement for being included in the research. He writes: "We would expect that a small minority who publicly identifies an interpretable political orientation to be very likely to behave quite differently than the average person with respect to consuming ideological political news." Sandvig also finds the way the study was framed questionable, almost as if it were written as an alibi: "Facebook is saying: It's not our fault! You do it too!"

As you can see, the study has become quite controversial. If you want to draw your own conclusions, you can pore over the write-up on Facebook Research for a more thorough look at the results, and find even more detail in the paper published in Science.