
Facebook disables accounts of NYU team looking into political ad targeting

The NYU Ad Observatory project violated its terms against scraping, Facebook said.

Before the US election last year, a team of researchers from New York University's engineering school launched a project to gather more data on political ads. In particular, the team wanted to know how political advertisers choose which demographics their ads target and which they exclude. Shortly after the project, called the NYU Ad Observatory, went live, however, Facebook notified the researchers that their efforts violated its terms of service related to bulk data collection. Now, the social network has announced that it has "disabled the accounts, apps, Pages and platform access associated with NYU's Ad Observatory Project and its operators..."

The researchers created a browser extension to collect data on the political ads Facebook shows to the thousands of people who volunteered to take part in the initiative. Facebook says, however, that the plug-in was designed to evade its detection systems, and it calls what the tool does "unauthorized scraping." The extension "scrape[d] data such as usernames, ads, links to user profiles and 'Why am I seeing this ad?' information," Facebook wrote in its announcement. The company also said the extension collected data about Facebook users who didn't install it and didn't consent to take part in the project.

The company wrote that it made "repeated attempts to bring [the team's] research into compliance with [its] Terms." That apparently included inviting the researchers to access its US 2020 Elections ad targeting data through FORT's Researcher Platform. Facebook said the data set available on the platform could offer more comprehensive information than the extension could collect, but the researchers declined the invitation.

As The Wall Street Journal noted in its report last year, Facebook maintains an archive of advertisements on its platform, which includes data on who paid for an ad, when it ran and the location of the people who saw it. However, it doesn't contain targeting information, such as how advertisers determine who sees an ad. On its website, the Ad Observer researchers wrote: "We think it's important to democracy to be able to check who is trying to influence the public and how."

Facebook is adamant that it disabled the project's access to its platform because the team knowingly violated the website's terms against scraping. It blocked the team's access, it said, in order to "stop unauthorized scraping and protect people's privacy in line with [its] privacy program." After the Cambridge Analytica scandal, Facebook reached a settlement with the FTC that pushed the social network to limit third-party access to its data. We asked the Ad Observer team for a statement and will update this post if we hear back.

Update 8/04/21 7PM ET: Laura Edelson, the lead researcher behind NYU Cybersecurity for Democracy, which operates Ad Observatory, told Engadget in a statement:

"This evening, Facebook suspended my Facebook account and the accounts of several people associated with Cybersecurity for Democracy, our team at NYU. This has the effect of cutting off our access to Facebook's Ad Library data, as well as Crowdtangle. Over the last several years, we’ve used this access to uncover systemic flaws in the Facebook Ad Library, to identify misinformation in political ads including many sowing distrust in our election system, and to study Facebook’s apparent amplification of partisan misinformation. By suspending our accounts, Facebook has effectively ended all this work. Facebook has also effectively cut off access to more than two dozen other researchers and journalists who get access to Facebook data through our project, including our work measuring vaccine misinformation with the Virality Project and many other partners who rely on our data.

The work our team does to make data about disinformation on Facebook transparent is vital to a healthy internet and a healthy democracy. Facebook is silencing us because our work often calls attention to problems on its platform. Worst of all, Facebook is using user privacy, a core belief that we have always put first in our work, as a pretext for doing this. If this episode demonstrates anything it is that Facebook should not have veto power over who is allowed to study them."
