
Canada’s Privacy Commissioner Rules That Clearview AI Facial Recognition Software Violates Privacy Laws, Must Delete Biometrics From Its Database

Already unwelcome in at least one state and a number of cities in the US, Clearview AI has been shown the door in Canada as well. The controversial facial recognition company’s practices have been deemed to violate the country’s privacy laws, and the privacy commissioner has requested that it delete the faces of Canadians from its databases.

Clearview AI was rapidly taken up by law enforcement agencies around the world in recent years before it was discovered that it was scraping social media sites to build its massive database of facial images, often in violation of those sites’ terms of service. The controversy caused some local governments to place moratoriums and bans on its use, and the company voluntarily pulled out of the state of Illinois in 2020 after it was found to be in violation of state privacy laws.

Clearview AI’s scraping violates consent laws

Privacy Commissioner Daniel Therrien said at a news conference that Clearview AI may not make use of the facial images of Canadians without their explicit consent. Therrien characterized Clearview AI’s scraping of websites and social media platforms for public pictures as “mass surveillance” that is illegal under the country’s privacy laws.

The Canadian government does not have the authority to order the New York-based company to delete the images of Canadians, but Clearview AI opted to withdraw from the country entirely as the investigation was playing out. A report from the commissioner’s office notes that the facial recognition giant was considering remaining out of the Canadian market for two years while the country develops relevant guidelines. Formal action against the company has thus far been limited to a “letter of intention” directing it to stop scraping the facial images of Canadians and to delete any images already collected.

However, this decision may keep Clearview AI out of Canada’s government agencies in the future. Prior to the investigation, it was used by dozens of organizations across the country, including the Royal Canadian Mounted Police (RCMP).

Clearview AI issued a statement in response that reflects its basic defense in similar cases elsewhere in the world: “Clearview AI only collects public information from the Internet which is explicitly permitted … Clearview AI is a search engine that collects public data just as much larger companies do, including Google, which is permitted to operate in Canada.” The company is estimated to have scraped some three billion images from public-facing websites and social media profiles to build the face-matching service that is widely used by law enforcement agencies. Clearview’s specific argument in Canada was that a section of the Personal Information Protection and Electronic Documents Act (PIPEDA) permits the use of publicly available information in this way, an assertion the privacy commissioner directly rejected in the decision.

Clearview AI announced that it plans to challenge the decision in court. It will keep its facial recognition services out of the Canadian market for the foreseeable future, but has also said that it will not remove Canadians from its database unless they send the company a photo (its standard policy in other markets). In the Illinois case, Clearview AI removed the faces of state residents (in addition to withdrawing from the state) under the threat of hefty fines.

Canadian privacy laws evolve to address facial recognition and scraping

This decision may serve as a precedent to help firm up Canadian privacy laws and the handling of biometric information in the country.

As part of its counterargument, Clearview AI asserted that there would be no material harm to Canadians from the use of these scraped facial images. The final report disagreed, noting that false flags and incorrect matches were a possibility in addition to the invasion of privacy. Some individuals with an unfortunate resemblance to certain wrongdoers might find themselves regular participants in a police lineup.

The report also noted that the decision applies to parts of Canada that are not directly subject to the federal privacy law, namely the provinces of Quebec, Alberta and British Columbia, which have their own private-sector privacy legislation. It also rejected Clearview’s assertion that these privacy laws should not apply to it because the company does not have a “real and substantial connection” to Canada.

A related class action lawsuit has been proposed in the country’s Federal Court. It seeks damages from Clearview AI on behalf of all Canadians who were swept up in its facial recognition technologies without their knowledge or consent, arguing that the company engaged in illegal mass surveillance in violation of Canadian privacy laws.

The head of the RCMP is also expected to be called before the House of Commons ethics committee at some point to explain the agency’s use of the surveillance software. The RCMP has previously praised Clearview AI’s software in public statements, saying that it is primarily used for cases of online sexual exploitation of children and has been a great help in that area.