Nearly every country is at least planning to initiate a coronavirus contact tracing program that makes use of mobile phones; some have had active programs in place since as early as late January. The Western democracies have lagged in getting these measures into place, primarily because the inherently invasive nature of such programs runs up against privacy laws and public attitudes. But even in these countries, some national and local emergency laws have temporarily suspended certain freedoms and civil rights. Some of the world’s surveillance technology companies have leapt into this emerging market, pitching tools intended for law enforcement as pandemic countermeasures to be used against those who violate social distancing measures, quarantine orders and travel restrictions.
These sales pitches attempt to sidestep thorny data privacy issues by focusing on those who violate the more restrictive temporary measures. They essentially argue that if the subjects are either outside the law or subject to extraordinary measures (i.e. a quarantine order), then using surveillance technologies and phone hacking tools against them should not be a legal issue.
Surveillance technology companies taking advantage of market conditions
Tel Aviv-based surveillance technology firm Cellebrite is perhaps the highest-profile example. The company is best known for its mobile forensics products, and has been one of the world’s largest purveyors of tools to break encryption to law enforcement agencies for a little over a decade.
The company’s business dealings in this area have also riled privacy advocates on several occasions. In 2011, the Michigan ACLU drew international attention to the fact that the state police were using Cellebrite’s extraction tool in ways that could constitute unlawful search and seizure; over the past decade, other reports have surfaced indicating that law enforcement agencies all over the world may be using it in similar ways. A 2017 data breach also indicated that the company does business with just about any government agency that can afford to pay it, including authoritarian regimes that use the product to lean on political dissidents.
As of 2019, Cellebrite’s marketing has claimed to be able to crack the encryption on any iOS or Android phone that it can physically plug into. According to a company marketing email that was examined by Reuters in late April, Cellebrite is now pitching this technology to governments around the world as a tool to “quarantine the right people.” The implication is that it can be used to extract contact and movement information from the phones of people unwilling to voluntarily cooperate with government lockdown measures or contact tracing investigations.
The company is also no longer restricting its surveillance technology to law enforcement agencies. It has directly reached out to health care agencies as well, offering a more limited version of the product that cannot break into a confiscated device but is just as capable of quickly siphoning all of the personal data off of an unlocked phone.
Cellebrite does not make its client list public, but an investigation by Reuters confirmed that about a dozen countries in Asia, Europe and Latin America have purchased Cellebrite products for coronavirus containment purposes.
Another surveillance technology company that has successfully repackaged its technology as a coronavirus tracking system is NSO Group, creator of the controversial Pegasus spyware system. Pegasus has allegedly been used to track and spy on scientists and human rights activists in the UAE and Mexico, and was also connected to a campaign directed against journalist Jamal Khashoggi by Saudi Arabian authorities just prior to his killing.
Pegasus is a comprehensive spyware system that monitors everything on the target device once it is installed, from email to social media to VoIP. It is also capable of skimming WiFi passwords, location data and contacts. So how would this be used in coronavirus contact tracing? The government of Israel has contracted with NSO Group to build a nation-spanning surveillance technology platform meant to track the movements of coronavirus patients, but the project is currently on hold pending legal challenges.
Trust is the key element in successful coronavirus tracking
Any coronavirus tracking program conducted in a democratic country hinges on voluntary citizen acceptance and willful participation. Acceptance and participation are primarily driven by trust, both in the government and in the companies supplying the technology to track the virus. This has been one of the big barriers for Apple and Google in their joint coronavirus contact tracing program, and given that only a dozen countries have purchased these capabilities thus far it seems to be an even bigger barrier for the surveillance technology companies.
These companies have done little to inspire public trust. Their surveillance tools are proprietary and their dealings are business secrets, but when details leak out they always seem to involve either a questionable customer or the technology being put to a questionable use. This is a particular problem for surveillance companies such as NSO Group pitching a package that involves the use of cellular tower data, something more invasive (and more likely to conflict with national laws) than the Bluetooth-based plans currently under consideration in the US and throughout Europe.
Another problem that has dogged the implementation of coronavirus tracing plans is the concern that the collected data will be stored and used for other purposes by tech companies after the pandemic is over. Tal Dilian, CEO of surveillance technology firm Intellexa, did nothing to assuage those concerns when he stated in an interview with Reuters that he hoped governments would “adapt” his product for national security and mass surveillance once the public health crisis ends. That’s exactly the outcome that privacy advocates are hoping to avoid.