This flub comes just a week after Facebook's algorithm promoted a false story about Fox News anchor Megyn Kelly, and shortly after the company stopped having its human editors write descriptions for trending stories; the remaining editors now simply choose potential topics. That change followed controversy over alleged bias among those editors, allegations a Facebook investigation refuted.
"We're aware a hoax article showed up there," a Facebook spokesperson told the Washington Post, "and as a temporary step to resolving this we've removed the topic."
It makes sense that Facebook's human editors chose to highlight 9/11 as a trending topic, since plenty of stories are popping up as we approach the 15th anniversary of the terrorist attacks. But it's surprising they didn't pay extra attention to what, exactly, the algorithm surfaced for such a sensitive subject. (Or perhaps it's in their best interest to prove that the algorithm isn't smart enough on its own.)