Popular midscale department store chain Macy’s is facing a privacy lawsuit over its alleged use of the controversial facial recognition software sold by Clearview AI. The lawsuit was brought in the Northern District of Illinois, where a unique biometric privacy law requires clear written consent before any biometric identifiers are collected.
Though Clearview AI is not named in the lawsuit, this is not the first time the upstart surveillance company’s name has been invoked in relation to violations of the Illinois Biometric Information Privacy Act. The company has claimed that it only provides its software to law enforcement agencies, but a February 2020 leak of its client list revealed that it was working in some capacity with a number of major retailers including Macy’s.
Another Illinois privacy lawsuit featuring Clearview AI
The privacy lawsuit was filed by a Chicago woman who is a regular shopper at Macy’s and claims that the store pairs facial recognition technology with its in-store security cameras. The basis for this claim appears to be the Clearview AI leak from earlier this year. The proposed class action suit alleges that Macy’s profits from the technology by using it for both marketing and security purposes; class action status has yet to be approved by Chicago’s federal court. The privacy lawsuit seeks $1,000 for each negligent violation and $5,000 for each intentional violation, and would also require Macy’s to delete any stored biometric identifiers and cease use of the facial recognition service.
Reporting by Buzzfeed indicates that Macy’s was among the Clearview AI clients that created an account with the surveillance company and executed at least one search. While some of these retail and banking clients were only offered a 30-day free trial and ran only a handful of searches, Macy’s appears to have signed on for a longer period. Buzzfeed reports that it ran one of the largest numbers of facial image searches among non-government clients: at least 6,000 over an unspecified period of time.
Macy’s declined to comment on its relationship with Clearview AI to the Chicago Tribune and other media outlets, citing a policy of not discussing pending litigation.
Clearview AI’s legal problems (both inside and outside of Illinois) stem from its practice of populating its facial recognition database with public images scraped from various social media services. This scraping was often in violation of the platforms’ terms of service, something that prompted Facebook and Apple, among others, to demand that Clearview AI stop earlier this year. In May the ACLU initiated a similar suit against the beleaguered surveillance company in Illinois. In total, Clearview AI is estimated to have scraped some three billion images from social media and other public sources for its surveillance database, without ever notifying the people pictured.
Illinois: The new biometric privacy battleground?
Companies that operate nationally are increasingly finding themselves subject to privacy lawsuits in Illinois. The state’s protection of biometric data is unique in the US, with consent requirements and a private right of action that exceed even the scope of the California Consumer Privacy Act.
Facebook, which has attempted to bar Clearview AI from mining its resources, was itself subjected to a privacy lawsuit in Illinois earlier this year. The company agreed to settle for $550 million in a case involving its tagging of state residents by name in photos. In July, a federal judge in the state rejected the settlement terms, given that the expected payout to each aggrieved party would have been only about $150.
A number of other major companies have faced privacy lawsuits in the state since the legislation was enacted in 2008. Photo book company Shutterfly was sued in 2017 over automated scanning of uploaded facial images, as was Google over similar alleged scanning of pictures uploaded to the Google Photos service (though that suit was ultimately dismissed in late 2018). The Six Flags amusement park chain was also sued last year over its collection of employee thumbprints for use with the company’s punch clock system.
The Illinois biometric privacy law requires that clear written consent be obtained from the data subject before any biometric markers are collected. It also sets requirements for secure storage of this data and for establishing a retention schedule and deletion guidelines. A number of other state laws contain similar elements, but none matches the full scope of protections that Illinois offers. While the civil remedy is available only to residents of the state, the law’s existence has forced American tech companies that operate on a national scale to account for it in their broader operational planning.