A few weeks ago a group of U.S. Republican senators introduced the Lawful Access to Encrypted Data Act, a bill that would compel American tech companies to put backdoors in their products to give law enforcement access to customers’ data with a warrant.
If passed, the Act would put everyone’s data at risk while reversing decades of work to make encryption stronger and personal data more secure.
Even for purposes of lawful access, no company wants government pressure to insert vulnerabilities in their products. Adding any sort of hidden backdoor access or decryption capability potentially jeopardizes a company’s reputation and its business prospects.
Yet such pressure is a reality in the tech industry. In 2016, for example, Apple famously refused an FBI request to unlock an iPhone linked to a shooting in California.
But it’s not just the U.S. government pushing companies to install backdoors. In 2018, the Five Eyes intelligence-sharing alliance, consisting of the U.S., U.K., Canada, Australia, and New Zealand, issued a letter warning that if private companies refused to help authorities decode encrypted emails, text messages, and voice communications, the Five Eyes governments might “pursue technological, enforcement, legislative or other measures to achieve lawful access solutions.”
Tech companies have faced such pressure to compromise privacy for decades. In the 1990s, the FBI floated the concept of “key escrow,” whereby software developers would deposit copies of their encryption keys with a third party, to be handed to authorities upon request. In 2017, then–U.S. Deputy Attorney General Rod Rosenstein claimed that encrypted apps were protecting criminals like terrorists and drug dealers. Rosenstein acknowledged that “the approach taken in the recent past – negotiating with technology companies and hoping that they eventually will assist law enforcement out of a sense of civic duty – is unlikely to work.”
Why would that be? Do technology companies lack a sense of civic duty? A more likely explanation is that they know vulnerabilities are inevitably discovered and exploited. This turns their products into the cyber-equivalent of a suitcase lock made for the Transportation Security Administration. TSA authorities have master keys that can open any TSA-approved lock, allowing the agency to open your luggage without breaking the lock and damaging your suitcase.
As many people predicted when the program was rolled out, this vulnerability was soon exploited by hackers, allowing anyone with a 3D printer to copy the master keys and sell them on the black market. The same thing could easily happen with data networks.
As the Internet of Things ushers in an era of cyber-physical systems, network security is becoming a matter of life and death: instead of just disclosing your credit card information, a system breach could hijack the car you’re driving or stop the pacemaker that controls your heart.
Think about the legal liability, reputational damage, and potential loss of life. What company wants any part of that?
The bill’s sponsors in Congress will argue that the Lawful Access to Encrypted Data Act will keep America’s networks safe. Sadly, the effect would be the exact opposite. And tech companies know it.
Just as no organization wants to get hacked, no company in the world wants to install backdoors in its technology. It’s simply bad for business: products are compromised, customers angered, and corporate reputations tarnished or destroyed.
Any backdoor makes us all less secure. As cybersecurity expert Bruce Schneier points out, you can’t build a backdoor “that only works with proper legal authorization, or only for people with a particular citizenship or the proper morality.” If a backdoor exists, it can be exploited. That’s a risk no company is willing to take.