To the complete horror and amusement of those watching the grand experiment Facebook is doing on everyone, this week we found out the company is assigning a reputation score to users that ranks their trustworthiness. The perversity of the situation was lost on no one. (And no, it's not the kind of perversity we like; this is Facebook, after all, the anathema to human sexual expression.)
The company told The Washington Post on Tuesday that its ranking system is a new automated tool to aid its fight against "fake news." The score is based in part on how a person uses the "fake news" reporting tool.
As in, the people using the flagging system Facebook announced in a December 2016 post.
Facebook assured the public in April 2017 that, thanks to its fabulous new flagging system, "overall that false news has decreased on Facebook" — but it did not provide any proof. That was because "it's hard for us to measure because we can't read everything that gets posted." Then, in September 2017, Facebook told the press the "disputed" system worked and that there was data to prove it; the problem was, Facebook's fact-checkers told the same outlets that they had no idea whether it worked and that it might be making the problem worse. By December 2017, Facebook had killed the reporting tool's "disputed" flag, saying it would show people Related Articles instead.