Many people may have received their first exposure to the concept of “data scraping” from reading about Clearview AI, which was banned from several major social media platforms for harvesting pictures of users. But the technique has research and academic applications as well. According to Facebook, intentions don’t matter: anything that violates its scraping policy is subject to removal. That is the message the social media giant has delivered to a New York University research project that equipped volunteers with a custom browser extension to capture information from the political ads they see on the platform, as part of a large-scale study of ad targeting during election seasons.
Late last year, Facebook sent a letter to the NYU researchers warning about a violation of its data collection terms of service and asking for the project to be shut down and any collected data to be deleted. The researchers persisted despite threats of “further enforcement.” Now, nearly a year later, Facebook has terminated the accounts of the researchers and blocked their access to the platform. Facebook said that it was required to do this by the terms of a consent agreement with the Federal Trade Commission (FTC) in an unrelated 2019 case. However, the FTC quickly stepped in to say that Facebook was not in fact required to do this and called the company’s actions “misleading” and “disappointing.”
NYU ad targeting study caught up in broader Facebook legal issues
The story dates back to October 2020, when Facebook first sent the letter threatening “enforcement” to Laura Edelson and the team of NYU researchers at the Tandon School of Engineering. The project that Facebook objects to is the NYU Ad Observatory, a research effort involving about 6,500 volunteers kitted out with a custom browser extension made to capture the political ads that are shown to them as they use Facebook. The project seeks to gather more information about Facebook’s opaque ad targeting and how it chooses what to show people, a goal with a strong public-interest case given the outsized influence Facebook advertising is claimed to have had on elections dating back to 2016.
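At its core, a crowdsourced ad-collection extension of this kind scans the rendered page for ad units and records the visible sponsorship disclaimer. The sketch below is a hypothetical illustration in Python, not the Ad Observer’s actual implementation: the `ad-disclaimer` class name and the sample markup are invented stand-ins for whatever markers a real extension would match against Facebook’s DOM.

```python
from html.parser import HTMLParser

class AdDisclaimerParser(HTMLParser):
    """Collect text from elements whose class suggests a sponsorship disclaimer.

    The 'ad-disclaimer' class name is a hypothetical marker, not Facebook's
    real markup; a production extension would need far more robust matching.
    """

    def __init__(self):
        super().__init__()
        self._in_disclaimer = False
        self.disclaimers = []

    def handle_starttag(self, tag, attrs):
        # Flag that the next text node belongs to a disclaimer element.
        if "ad-disclaimer" in dict(attrs).get("class", ""):
            self._in_disclaimer = True

    def handle_data(self, data):
        if self._in_disclaimer:
            self.disclaimers.append(data.strip())
            self._in_disclaimer = False

# Invented sample markup standing in for a captured ad unit.
sample = '<div class="ad-disclaimer">Paid for by Example PAC</div>'
parser = AdDisclaimerParser()
parser.feed(sample)
print(parser.disclaimers)  # ['Paid for by Example PAC']
```

A volunteer-side extension would run logic like this against pages as they load and forward only the extracted disclaimer and targeting metadata, not the user’s own profile data, to the research database.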
FTC and others react to shutdown of political ad targeting study
Facebook finally shut down the ad targeting research project in early August. However, the FTC was quick to follow up. Facebook made the broad claim that it was following its own policy and protecting the privacy of other platform users, but also cited a 2019 consent decree it had entered into with the FTC as part of its $5 billion settlement over violation of a 2012 FTC privacy order. The FTC was quick to weigh in on this aspect of the dispute, saying that the 2019 decree required no such thing and calling it a “misleading claim.”
The FTC posted a letter to Facebook CEO Mark Zuckerberg calling the claim “inaccurate” and referring to the company’s conduct as “disappointing.” Facebook was taken to task for not notifying the FTC in advance that it was going to cite the consent decree as a reason for shutting down the ad targeting research project. However, the letter amounts to little more than a public rebuke: the FTC said that it would not get involved in individual disputes between Facebook and third parties.
Facebook’s move against the ad targeting research also prompted a response from at least one legislator. Senator Mark Warner (D-Virginia) tweeted about the incident, suggesting that online advertising should be subject to a greater degree of regulation.
The researchers have made the Ad Observer database available to academic researchers and journalists, which in turn has already informed several news stories about how political ads are funded and varying levels of activity and engagement along the political spectrum.
Facebook’s policy is not necessarily aimed at censoring reporting on its ad targeting technology; the company has good reason to be wary of large-scale scraping after the one-two punch of the Cambridge Analytica and Clearview AI scandals. Facebook does directly provide some public information about who pays for political advertisements and the locations of people who see them, but keeps its ad targeting technology under tight wraps, for competitive reasons if nothing else. The NYU researchers, however, maintain that they were not in violation of the platform’s terms, that the research is vital to the public interest, and that Facebook has now also effectively disabled their similar Virality Project, which tracks the spread of vaccine misinformation.