Blurred view of people wearing masks walking in the street, illustrating mass surveillance and automated decision-making

Automated Decision-Making Helped to Reduce the Negative Impact of the Pandemic, But Also to Normalize Mass Surveillance

Automated decision-making technology helped to save lives during the Covid-19 pandemic in a number of different ways. But a new report from AlgorithmWatch also contends that the pandemic was used as an excuse to normalize various elements of mass surveillance, with some of these having no real impact on Covid-19 response at all.

The pandemic was used as an excuse to ram through a number of measures that were never democratically debated and were deployed without adequate safeguards or public transparency, the watchdog group warns. The year-long study identified Covid contact tracing apps, digital Covid certificates and thermometer-equipped drones as tools that have been abused in some cases.

Covid used to usher in new forms of mass surveillance

The AlgorithmWatch report is centered on the organization’s year-long “Tracing the Tracers” project, a platform that continuously catalogued automated decision-making (ADM) systems in a database, aided by public flagging of new systems as they appeared. The project’s goal was to track the impact of these systems on privacy, human rights, fairness and social justice.

The report notes that the use of these ADM tools was frequently debated, but “… not in an evidence-based fashion and mostly based on contradictory, faulty, and incomparable methods.” The tools were also often sold to the public on “… unfounded promises and marketing-oriented hype.”

This is not to say that the report dismisses all ADM tools as snake oil. It notes that some made positive contributions to a wide range of public health outcomes, such as making travel safer and helping to prioritize the delivery of vaccines to those most at risk from the virus. However, the overall impact is qualified as “mixed” and not “… central, fundamental, or even necessary to effectively respond to the pandemic.”

The report singles out digital contact tracing apps and domestic digital Covid certificate schemes (such as the “Green Pass” concept first developed in Israel and then brought to Europe) as areas where results were questionable, given how polarizing the concepts proved to be and how much “hype” and “propaganda” accompanied their integration into daily public life.

Most of the “dystopian” implementations of ADM (those that verged on mass surveillance) were found outside of Europe, according to the report. However, it did find that some ADM tools implemented in Europe violated European Union law and international human rights law. One example of a controversial European measure was a brief trial of facial recognition cameras in French subway stations to detect whether passers-by were wearing masks. The trial was shut down after a short period when the French data protection authority (CNIL) warned that the program was likely in violation of General Data Protection Regulation (GDPR) terms that safeguard against indiscriminate mass surveillance.

Though a number of these tools were individually worrying in their implications for privacy and fundamental rights, the report points out that much of the “normalization” of mass surveillance came from the secondary effects of lockdowns and social distancing rules. As school and work shifted into the home, employers and administrators attempted to replicate their usual means of monitoring and discipline via tools with surveillance elements that peered directly into people’s private spaces. Examples include the emergence of the “online proctoring” industry for at-home test taking, and employee monitoring tools, which saw a 500% spike in monthly use during the pandemic.

How to reap the benefits of ADM without overstepping into mass surveillance

The report’s suggestions for curtailing mass surveillance and improving ADM systems include improved transparency, an end to “vague and undefined” exceptions to established laws and protections, clear time limits on measures that show potential to break out into long-term surveillance programs, and not being baited into an “AI arms race” (particularly with China).

The report also carries a particular warning about the use of biometrics in ADM systems. Biometric identification in the name of security usually meets fierce debate and resistance, but the pandemic presented an opportunity to steamroll those concerns under the banner of public safety. The report points to the Australian quarantine system as one example: subjects are asked to check in from their home location via app three times per day, and several states in the country have added biometric facial verification to this process.
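To make the privacy stakes concrete, the sketch below shows the kind of check-in logic such a quarantine app implies: a pass requires both that the phone’s reported position falls inside a geofence around the registered address and that a fresh selfie matches the enrolled face template. The Australian state apps are closed source, so every name, the match threshold and the 50-metre radius here are illustrative assumptions, not the actual implementation.

```python
import math
from dataclasses import dataclass

# Hypothetical model of a quarantine check-in; all fields, thresholds and the
# geofence radius are illustrative assumptions, not the real apps' values.
@dataclass
class CheckIn:
    latitude: float          # position reported by the phone's GPS
    longitude: float
    face_match_score: float  # 0..1 similarity of selfie vs. enrolled photo

def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Haversine distance between two WGS84 points, in metres."""
    r = 6_371_000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def verify_check_in(check: CheckIn, home_lat: float, home_lon: float) -> bool:
    # A check-in passes only if the phone is near the registered address
    # AND the selfie matches the enrolled face template.
    at_home = distance_m(check.latitude, check.longitude, home_lat, home_lon) <= 50.0
    face_ok = check.face_match_score >= 0.8  # assumed vendor threshold
    return at_home and face_ok

# Example: a check-in from roughly 20 m away with a strong face match passes.
print(verify_check_in(CheckIn(-34.92866, 138.60099, 0.93), -34.92884, 138.60100))
```

Even in this toy form, the design requires the phone to submit precise location data and a fresh face image several times a day, which is precisely the normalization concern the report raises.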

In addition to the concerns about mass surveillance, ADM systems have also had documented problems with leaks and data breaches. One example is a bug in the joint Google/Apple contact tracing protocol adopted by multiple countries around the world; fortunately, an ethical cybersecurity student discovered it before it could be exploited to interrupt the Bluetooth transmissions the system relies on.
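For context on what those Bluetooth transmissions carry: the Exposure Notification protocol broadcasts short-lived “rolling proximity identifiers” derived from a daily random key, so nearby phones can later check for exposure without learning who they met. The sketch below follows the publicly documented Exposure Notification cryptography specification (an HKDF key derivation plus one AES block per 10-minute interval); it is an illustrative reconstruction in Python, not code from the report or from either vendor.

```python
import os
import time
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

def en_interval_number(unix_ts: int) -> int:
    # The spec divides time into 10-minute (600-second) intervals.
    return unix_ts // 600

def derive_rpik(tek: bytes) -> bytes:
    # RPIK = HKDF(TEK, salt=None, info="EN-RPIK", length=16)
    return HKDF(algorithm=hashes.SHA256(), length=16, salt=None,
                info=b"EN-RPIK").derive(tek)

def rolling_proximity_identifier(rpik: bytes, interval: int) -> bytes:
    # PaddedData = "EN-RPI" || six zero bytes || interval number (uint32 LE);
    # the RPI is a single AES-128 encryption of that one 16-byte block.
    padded = b"EN-RPI" + bytes(6) + interval.to_bytes(4, "little")
    enc = Cipher(algorithms.AES(rpik), modes.ECB()).encryptor()
    return enc.update(padded) + enc.finalize()

tek = os.urandom(16)  # daily Temporary Exposure Key, kept on the device
rpik = derive_rpik(tek)
rpi = rolling_proximity_identifier(rpik, en_interval_number(int(time.time())))
print(rpi.hex())  # the 16-byte identifier advertised over Bluetooth LE
```

Because the advertised identifier rotates every interval and can only be linked together by someone holding the daily key, observers learn nothing durable about a device; a flaw that let an attacker disrupt these broadcasts would therefore have undermined the system’s core function rather than its privacy design.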