
Facebook and Instagram reveal content ‘recommendation guidelines’

The guidelines are Facebook’s internal rulebook for recommended content.


Facebook and Instagram are trying to peel back the curtain on what has long been one of the least understood parts of their platforms: how they recommend content users don’t already follow. Today, the company published its “recommendation guidelines” for both Facebook and Instagram.

The guidelines are essentially Facebook’s internal rulebook for determining what type of content is “eligible” to appear prominently in the app, such as in Instagram’s Explore section or in Facebook’s recommendations for groups or events. The suggestions are algorithmically generated and have been a source of speculation and scrutiny.

Notably, the guidelines shared today don’t shed much light on how Facebook determines its recommendations. In a statement, Facebook’s Guy Rosen notes suggestions are personalized “based on content you’ve expressed interest in and actions you take on our apps,” but doesn’t offer specifics. What the guidelines do detail is the type of content Facebook blocks from recommendations throughout its platform.

Specifically, the posts list five categories of content that “may not be eligible for recommendations.” These include borderline content that doesn’t break the company’s rules but that Facebook considers objectionable, such as “pictures of people in see-through clothing”; spammy content, like clickbait; posts “associated with low-quality publishing”; and posts that have been debunked by fact checkers.

Though the rules themselves aren’t new — Facebook says it has been using the guidelines since 2016 — this is the first time the company has made these policies visible to users.

It may also help Facebook address criticism as the social network has come under increasing scrutiny for its algorithmically generated recommendations. The suggestions have been widely criticized for leading people to conspiracy theories or extremist content they may not otherwise go searching for. People who follow anti-vaccine pages on Instagram, for example, may also see recommendations for QAnon accounts and conspiracy theories about COVID-19. (Facebook’s guidelines confirm that vaccine misinformation and QAnon are both considered ineligible for recommendations.)

At the same time, users have long accused Facebook of censorship and “shadow bans” — the idea that the company hides some content as punishment for real or perceived infractions. By opening up these guidelines, Facebook will at least make it clearer why not all posts make it onto Instagram’s Explore section, for example.