The organization published 96 fact checks during this period. Of these, 59 were found to be false, 19 were a mix of truth and lies, seven were found to be opinion and six were judged as satire. Just five of the posts -- flagged by users concerned about their veracity -- were marked as true. The posts ran the gamut of current affairs, from misleading political information to false statements about vaccines. As The Times reports, much of this dubious information also came with a high health risk -- one post claims heart attack victims should cough "repeatedly and very vigorously" until help arrives. The British Heart Foundation has debunked this advice, and yet the post remains live on Facebook.
The problem appears to lie largely with Facebook's algorithms. Full Fact director Will Moy says they "are not yet at a stage where they can reliably identify information that is inaccurate." Furthermore, information flagged by the algorithms and then confirmed as false by fact-checkers remains on the site, although its reach is reduced by more than 80 percent. According to Moy, Facebook has been reluctant to give the organization more details on the impact the fact-checking is having on false content -- a complaint echoed by other fact-checking groups such as Snopes, which has since quit the program, citing concerns over a lack of transparency.
The report also found that while Facebook has extended its fact-checking program to more countries and languages, it needs to scale up both the volume of content it handles and the speed of its response. However, Full Fact does note that the fact-checking initiative is "worthwhile" and that "something similar may be needed on other internet platforms too," ultimately concluding: "We want Facebook to share more data with fact checkers, so that we can better evaluate content we are checking and evaluate our impact."