Amazon’s facial recognition suite Rekognition has been embroiled in controversy since it debuted in 2016. It started with Amazon’s very active marketing of the service to United States police departments, which raised serious civil rights and privacy concerns. The May data breach of border screening contractor Perceptics, along with the very recent breach of fingerprint and facial recognition platform BioStar 2, has added major security issues to that mix.
A new announcement regarding the service is likely to exacerbate all of those concerns. A few days ago, Amazon announced that it had added a new emotion to the list that Rekognition is able to read: fear.
Amazon’s facial recognition capability
Fear is actually the eighth psychological state that Amazon claims to be able to read with its facial recognition software. The company had previously announced that it could detect happy, sad, angry, surprised, disgusted, calm and confused facial expressions, as well as estimate a general age range. Competing companies such as Microsoft and Google offer similar analysis features, but none have yet claimed to be able to detect fear.
If it works as advertised, this new capability is particularly concerning given how aggressively Amazon has tried to sell this software to law enforcement agencies in recent years. There is particular concern about it being sold to the federal government, given that U.S. Customs and Border Protection has been seeking quotes for an overhaul of the border security system which would include expanded use of facial recognition technology.
Worries along these lines are speculative, but important to consider before there is wholesale adoption of this sort of new technology. For example, one possible implementation would be real-time scanning of the emotions of citizens during interactions with the police. The officer might use a reading of fear or aggression to escalate the situation to a higher level of force. Emotion scanning from pictures or a video might also color an approach to serving a search warrant on or apprehending a suspect.
While government use is the most prominent of the concerns about this technology, it is not the only one. Private enterprise might also begin using it in invasive ways. For example, Amazon is marketing the service to retailers as a means of scanning customer faces in real time to gauge their reactions to in-store experiences. Security teams might use emotion detection to make assumptions about situations or patrons' intentions and engage with them on private property. The technology might even be put to use in job interviews or negotiations to gauge the interest level of the other party.
How accurate is emotion detection software?
All of this ultimately hinges on the software operating with a high degree of accuracy. Amazon itself does not appear to be fully confident of that: the Rekognition developer guide includes a disclaimer stating that an emotion reading “is not a determination of the person’s internal emotional state and should not be used in such a way,” and acknowledges that the software can be defeated by a person who is intentionally making a particular type of face.
The software does not so much detect emotions as link certain facial expressions with certain emotions: it matches known facial positions that correspond to emotional states, and those positions can be deliberately manipulated by the subject. Some people may simply have “one of those faces,” constantly scanning as “angry” or “sad” even when they are feeling neither.
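For context, Rekognition does not report a single emotion label but a list of candidate expressions, each with a confidence score, in its DetectFaces response. The sketch below shows how such a response might be interpreted; the sample values are illustrative, not real API output, and the `dominant_emotion` helper and its 50% threshold are assumptions for this example, not part of Amazon's API.

```python
# Sketch: interpreting a Rekognition-style DetectFaces response.
# The response shape follows the AWS Rekognition developer guide;
# the sample confidence values below are illustrative, not real output.

# In a live call, this dict would come from something like:
#   boto3.client("rekognition").detect_faces(Image=..., Attributes=["ALL"])
sample_response = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "CALM", "Confidence": 62.1},
                {"Type": "FEAR", "Confidence": 21.7},
                {"Type": "CONFUSED", "Confidence": 9.4},
            ]
        }
    ]
}

def dominant_emotion(face_detail, threshold=50.0):
    """Return the highest-confidence emotion label, or None if no
    reading clears the threshold. Per Amazon's own disclaimer, this
    reflects apparent facial expression, not internal emotional state."""
    emotions = face_detail.get("Emotions", [])
    if not emotions:
        return None
    top = max(emotions, key=lambda e: e["Confidence"])
    return top["Type"] if top["Confidence"] >= threshold else None

for face in sample_response["FaceDetails"]:
    print(dominant_emotion(face))  # prints: CALM
```

Note that even in this toy example, a "CALM" reading at 62% confidence still leaves substantial probability mass on other expressions, which is part of why treating any single label as ground truth is risky.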
The science linking facial positions to emotions is hardly settled, however. A July 2019 literature review found that the available scientific evidence reveals relatively little about “how and why certain facial movements express instances of emotion” and that “less actionable guidance” is available to companies like Amazon than is commonly assumed.
The accuracy of emotion detection is thus highly questionable at this point. Compounding the reliability problem, the accuracy of the underlying facial recognition itself appears to drop the darker a person's skin is. Tests run on existing facial recognition software used by Customs and Border Protection found that black women's faces were incorrectly matched to their passport photos 10 times more often than the faces of white women.
The debate over facial recognition
It is unknown exactly how many law enforcement agencies use Amazon’s facial recognition services, as the technology is not currently regulated and there is no real requirement to disclose the use of it. Only two agencies had made their use of it public prior to this year – the police departments of Hillsboro, OR and Orlando, FL. Orlando dumped the facial analysis service last month due to public pushback and concerns about rates of false positives.
Similar concerns are causing politicians to call for pre-emptive facial recognition bans. At the federal level, representatives from both sides of the aisle have expressed concerns about how artificial intelligence technology like Rekognition might ultimately end up infringing on civil liberties. Democratic presidential candidate Bernie Sanders has called for a ban, and Republican representative Jim Jordan has expressed support for a moratorium while the technology is evaluated more carefully. Several cities have taken it upon themselves to ban facial recognition technology already, San Francisco the most prominent among them. The American Civil Liberties Union also formally objected to Amazon Rekognition specifically, running a test in which over 20 members of Congress were falsely matched with criminals.
The debate over privacy and potential civil liberties violations ultimately boils down to trust – trust in tech companies like Amazon to deliver an extremely accurate facial recognition product that can be relied upon, and trust in the agencies that use it not to overstep their bounds or draw erroneous, hasty conclusions. Both parties have experienced issues that make it difficult for many to extend that trust. Improved accuracy for emotion detection systems is really just one small part of earning public support for them.