Researchers shut down Instagram study following backlash from Facebook

The company says it had privacy concerns about the project.

AlgorithmWatch, a group of researchers who had been studying how Instagram’s opaque algorithms work, says it was recently forced to halt the project over concerns that Facebook planned to take legal action against it. In a post spotted by The Verge, AlgorithmWatch claims the company accused it of breaching Instagram’s terms of service and warned of “more formal engagement” if the group did not “resolve” the issue.

AlgorithmWatch’s research centered on a browser plugin that more than 1,500 volunteers installed. The tool helped the team collect information that, it says, allowed it to make some inferences about how Instagram prioritizes certain photos and videos over others.
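To give a rough sense of how a data-donation add-on like this can work, here is a minimal TypeScript sketch of a content script that reads only the posts Instagram has already rendered in the volunteer’s own feed and forwards basic metadata to a research server. The endpoint URL, the `data-post-id` attribute, and the field names are all hypothetical illustrations; this is not AlgorithmWatch’s open-sourced code.

```typescript
// Illustrative data-donation content script (hypothetical, not AlgorithmWatch's code).
// It only looks at posts already rendered in the volunteer's own feed.

const RESEARCH_ENDPOINT = "https://example-research-server.org/donate"; // placeholder URL

interface DonatedPost {
  postId: string;
  mediaType: "photo" | "video" | "unknown";
  position: number;   // where the post appeared in the feed
  observedAt: string; // when the volunteer saw it (ISO timestamp)
}

function collectVisiblePosts(): DonatedPost[] {
  // Assume each feed post is wrapped in an <article> element.
  const articles = Array.from(document.querySelectorAll("article"));
  return articles.map((el, index): DonatedPost => ({
    postId: el.getAttribute("data-post-id") ?? `unknown-${index}`, // attribute name is assumed
    mediaType: el.querySelector("video") ? "video" : el.querySelector("img") ? "photo" : "unknown",
    position: index,
    observedAt: new Date().toISOString(),
  }));
}

async function donateFeedSnapshot(): Promise<void> {
  const posts = collectVisiblePosts();
  await fetch(RESEARCH_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ posts }),
  });
}

// Send a snapshot as the volunteer scrolls, throttled so the server isn't flooded.
let lastSent = 0;
window.addEventListener("scroll", () => {
  const now = Date.now();
  if (now - lastSent > 30_000) {
    lastSent = now;
    void donateFeedSnapshot();
  }
});
```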

Most notably, the team found that the platform encourages people to show skin. AlgorithmWatch said it reached out to Facebook for comment before publishing its findings, but the company did not respond at first. In May 2020, however, Facebook told the researchers their work was “flawed in a number of ways,” pointing to a list of issues it said it had found earlier that year with the methodology AlgorithmWatch had employed.

When Facebook accused AlgorithmWatch of breaching its terms of service, the company pointed to a section of its rules that prohibits automated data collection. It also said the tool violated the GDPR, the European Union’s data privacy law. “We only collected data related to content that Facebook displayed to the volunteers who installed the add-on,” AlgorithmWatch said. “In other words, users of the plugin [were] only accessing their own feed, and sharing it with us for research purposes.” As for Facebook’s GDPR allegations, the group said, “a cursory look at the source code, which we open-sourced, shows that such data was deleted immediately when arriving at our server.”
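AlgorithmWatch’s point about deleting data on arrival describes a common pattern in data-donation research: strip anything personal before a record is ever persisted. The TypeScript sketch below illustrates that pattern using the same hypothetical field names as above; it is an assumption-laden illustration, not the group’s open-sourced server code.

```typescript
// Illustrative ingest server (hypothetical, not AlgorithmWatch's open-sourced code).
// Personal fields are discarded the moment a donation arrives; only ranking-related
// metadata is ever kept.

import { createServer, IncomingMessage, ServerResponse } from "node:http";

interface IncomingPost {
  postId: string;
  mediaType: string;
  position: number;
  observedAt: string;
  username?: string; // personal data: dropped on arrival
  caption?: string;  // personal data: dropped on arrival
}

interface StoredPost {
  postId: string;
  mediaType: string;
  position: number;
  observedAt: string;
}

// Copy only the non-personal fields; everything else is never persisted.
function stripPersonalData(post: IncomingPost): StoredPost {
  const { postId, mediaType, position, observedAt } = post;
  return { postId, mediaType, position, observedAt };
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  if (req.method !== "POST" || req.url !== "/donate") {
    res.writeHead(404);
    res.end();
    return;
  }
  let body = "";
  req.on("data", (chunk) => {
    body += chunk;
  });
  req.on("end", () => {
    try {
      const { posts } = JSON.parse(body) as { posts: IncomingPost[] };
      const sanitized = posts.map(stripPersonalData);
      // A real system would write the sanitized records to storage here.
      console.log(`Stored ${sanitized.length} sanitized posts`);
      res.writeHead(204);
    } catch {
      res.writeHead(400);
    }
    res.end();
  });
});

server.listen(8080);
```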

Despite believing they had done nothing wrong, the researchers eventually decided to shutter the project. “Ultimately, an organization the size of AlgorithmWatch cannot risk going to court against a company valued at one trillion dollars,” they said.

When Engadget reached out to Facebook for comment on the situation, the company denied it had threatened to sue the researchers. Here’s the full text of what it had to say:

We believe in independent research into our platform and have worked hard to allow many groups to do it, including AlgorithmWatch — but just not at the expense of anyone’s privacy. We had concerns with their practices, which is why we contacted them multiple times so they could come into compliance with our terms and continue their research, as we routinely do with other research groups when we identify similar concerns. We did not threaten to sue them. The signatories of this letter believe in transparency — and so do we. We collaborate with hundreds of research groups to enable the study of important topics, including by providing data sets and access to APIs, and recently published information explaining how our systems work and why you see what you see on our platform. We intend to keep working with independent researchers, but in ways that don’t put people’s data or privacy at risk.

This episode has worrisome parallels with actions Facebook took earlier this month against the NYU Ad Observatory, a project that had been studying how political advertisers target their ads. Facebook offers some tools to assist researchers in their work, but for the most part its platforms have been a black box since the fallout from the Cambridge Analytica scandal. That’s a significant problem, as AlgorithmWatch points out.

“Large platforms play an oversized, and largely unknown, role in society, from identity-building to voting choices,” it said. “Only if we understand how our public sphere is influenced by their algorithmic choices, can we take measures towards ensuring they do not undermine individuals’ autonomy, freedom, and the collective good.”