Facial Recognition Technologies: Time to Face the Music?

By Jennifer Baker, EU Policy Correspondent

Authorities could consider a stricter approach to the use of facial recognition technologies.

This month, a school in northern Sweden triggered the first fine for use of a facial recognition system under Europe’s General Data Protection Regulation (GDPR).

The Swedish data protection authority (DPA) issued a fine of 200,000 Swedish kronor (approximately €18,500) after the school’s use of facial recognition technology to monitor student attendance was deemed to breach privacy laws.

The test run was conducted on a class of 22 students for several months last year.

The school in Skellefteå said it processed the sensitive biometric data with the students’ consent, but according to the European Data Protection Board (EDPB) such consent was “not a valid legal basis given the clear imbalance between the data subject and the controller.”

The municipality also failed to carry out an adequate impact assessment and to properly notify Sweden’s Data Protection Authority about the pilot programme.

The news comes after the UK Information Commissioner’s Office (ICO) announced it was opening an investigation into the use of facial recognition software at the busy station area of King’s Cross in London.

The Information Commissioner, Elizabeth Denham, also questioned the use of live facial recognition (LFR) technology by South Wales Police and the Met Police. “We understand the purpose is to catch criminals. But these trials also represent the widespread processing of biometric data of thousands of people as they go about their daily lives. And that is a potential threat to privacy that should concern us all,” she said.

“Any organisation using software that can recognise a face amongst a crowd then scan large databases of people to check for a match in a matter of seconds, is processing personal data. I believe that there needs to be demonstrable evidence that the technology is necessary, proportionate and effective considering the invasiveness of LFR,” she continued.

Courts, police authorities and data protection authorities in the UK and elsewhere are currently looking into what sort of safeguards are necessary – a process that seems to be happening too little, too late. Vast databases already exist.

Consent is generally the most relied-upon basis for processing facial biometric data, but as seen in the Swedish case, that consent must be informed and freely given in order to be GDPR-compliant.

In July, the EDPB published its guidelines on the processing of personal data through video devices, noting that “the use of biometric data, and in particular facial recognition, entail heightened risks for data subjects’ rights.”

Helpfully, the guidelines contain several real-world examples of how data protection law applies. For example: “A shop owner has installed facial recognition system inside his shop in order to customise its advertisement towards individuals. The data controller has to obtain the explicit and informed consent of all data subjects before using this biometric system and delivering tailored advertisement. The system would be unlawful if it captures visitors or passers-by who have not consented to the creation of their biometric template, even if their template is deleted within the shortest possible period. Indeed, these temporary templates constitute biometric data processed in order to uniquely identify a person who may not want to receive targeted advertisement.”

Another concern identified by the EDPB is that much facial recognition technology has not yet reached maturity and, far from accurately identifying people, may well be misidentifying individuals.

“In addition to privacy issues, there are also risks related to possible malfunctions of these devices and the biases they may induce. Researchers report that software used for facial identification, recognition, or analysis performs differently based on the age, gender, and ethnicity of the person it’s identifying. Algorithms would perform based on different demographics, thus, bias in facial recognition threatens to reinforce the prejudices of society. That is why, data controllers must also ensure that biometric adopted data processing deriving from video surveillance be subject to regular assessment of its relevance and sufficiency of guarantees provided,” read the guidelines.

The EDPB is accepting feedback on its guidelines until 9 September and may make revisions. Meanwhile, the Financial Times reports that the European Commission is considering regulating the use of the technology, but to date the GDPR remains the gold standard – at least in Europe.

In China, meanwhile, facial recognition is widely used. Megvii, the maker of the Face++ system and a leading Chinese facial recognition provider, has filed for a listing on the Hong Kong stock exchange just days after protesters destroyed a “smart lamppost” over fears they were being surveilled.

Around the world, less scrupulous regimes deploy facial recognition for surveillance and suppression. Despite this, facial recognition technologies appear on hardly any export control lists, which limit the export of so-called dual-use technologies – tech that can be used for good or ill – to despotic regimes. That may change, as the US and the EU are rumoured to be considering the matter.