
Facebook has been giving misinformation researchers incomplete data

And it was discovered because of the transparency report it released in August.


Misinformation researchers who've been relying on the data Facebook provides them may have lost months or even years of work. That's because the social network has been giving them flawed and incomplete information on how users interact with posts and links on the website, according to The New York Times.

Facebook has been giving academics access to its data over the past couple of years to track the spread of misinformation on its platform. It promised researchers transparency and access to all user interactions, but the data the company has been giving them reportedly only includes interactions from about half of its US users. Further, most of the users whose interactions were included in the reports are the ones who engage with political posts frequently enough to make their leanings clear.

In an email to researchers seen by The Times, Facebook apologized for the "inconvenience [it] may have caused." The company also told them that it's fixing the issue, but that doing so could take weeks due to the sheer volume of data it has to process. Facebook did tell the researchers, though, that the data they received for users outside the US is accurate.

Facebook spokesperson Mavis Jones blamed the data inaccuracy on a "technical error," which the company is apparently "working swiftly to resolve." As The Times notes, it was University of Urbino associate professor Fabio Giglietto who first discovered the inaccuracy. Giglietto compared the data handed over to researchers with the "Widely Viewed Content Report" the social network published in August and found that the results didn't match.

Other researchers raised concerns after that report was published. Alice Marwick, a researcher from the University of North Carolina, told Engadget that they couldn't verify those results, because they had no access to the data Facebook used. The company reportedly held a call with researchers on Friday to apologize. Megan Squire, one of those researchers, told The Times: "From a human point of view, there were 47 people on that call today and every single one of those projects is at risk, and some are completely destroyed."

Some researchers have been using their own tools to gather information for their research, but in at least one instance, Facebook cut off their access. In August, Facebook disabled the accounts associated with the NYU Ad Observatory project. The team used a browser extension to collect information on political ads, but the social network called that "unauthorized scraping." At the time, Laura Edelson, the project's lead researcher, told Engadget that Facebook was silencing the team because its "work often calls attention to problems on its platform." Edelson added: "If this episode demonstrates anything it is that Facebook should not have veto power over who is allowed to study them."