Microsoft inadvertently proved why Apple's firm stance against unlocking an iPhone belonging to one of the San Bernardino terrorists was the correct one. Apple's decision renewed the argument over how best to help law enforcement agencies ensure our collective security without violating an individual's right to privacy. But that debate overshadowed a key reason why encryption backdoors are a bad idea: eventually, they will be discovered by the wrong people.
In August 2016, Microsoft accidentally leaked the "golden key" to its Secure Boot firmware, allowing attackers to load malware onto any Windows device the feature was supposed to protect. The problem is that backdoors for some will invariably mean backdoors for all, including repressive regimes, malicious insiders, foreign spies, and criminal hackers. As the world's leading cryptographers have argued, backdoors in encryption, authentication systems, or any element of security would subvert their effectiveness by introducing enormous risk of exploitation. And backdoors in reputable commercial software would not prevent bad actors from finding alternative forms of encryption to hide their activities.
There are other factors that support this position:
US intelligence and law enforcement communities still wrongly believe that encryption technologies handicap their investigations. They worry that end-to-end encryption in certain applications and on mobile devices lets terrorists and criminals conceal their communications from surveillance.
That argument fails when you consider that even in the absence of backdoors, our online activity leaves extensive digital exhaust, referred to as metadata, which law enforcement can use once it is legally obtained. Metadata is "data about data": for example, a record that a chat conversation took place, rather than the contents of the conversation. While metadata discloses a lot less than actual data, it still discloses more than some would like.
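The distinction can be made concrete with a small sketch. The record below is hypothetical (the field names and addresses are illustrative, not drawn from any real messaging service), but it shows the essential point: even when the message body is end-to-end encrypted and opaque to the operator, the surrounding metadata remains readable and can answer a lawful request without any decryption.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class MessageRecord:
    """A hypothetical log entry a messaging service might retain."""
    sender: str          # metadata: who sent the message
    recipient: str       # metadata: who received it
    timestamp: datetime  # metadata: when it was sent
    ciphertext: bytes    # content: unreadable without the endpoints' keys

record = MessageRecord(
    sender="alice@example.com",
    recipient="bob@example.com",
    timestamp=datetime(2016, 9, 1, 14, 30, tzinfo=timezone.utc),
    ciphertext=b"\x8f\x1a\x02\x7c",  # opaque bytes as far as the operator is concerned
)

# What a lawful metadata request could yield -- note that the
# encrypted content never needs to be (and cannot be) opened:
visible = {
    "sender": record.sender,
    "recipient": record.recipient,
    "timestamp": record.timestamp.isoformat(),
}
print(visible)
```

This is exactly the shape of the trade-off the article describes: the record that a conversation took place is available to investigators, while the conversation itself stays protected.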
This controversy was recently highlighted by The Intercept, which showed how Apple logs iMessage contacts and could share that information with police. But the collection of metadata isn't new, and it is functionally essential to critical transactional systems: operations require logging and auditing, and telemetry and metadata are routinely analyzed to improve products and services. The combination of such metadata and lawful requests for assistance to technology and infrastructure companies could provide a trove of information without compromising the inherent security of products and services used daily by citizens against whom no probable cause exists. Furthermore, terrorist organizations and rogue nation-states are sophisticated when it comes to developing and using technology. There's nothing to stop them from creating their own encryption technologies that can't be cracked by law enforcement or tech companies, leaving only the law-abiding with the backdoored implementations.
Defending the right to privacy requires us not only to lobby against the passage of legislation mandating backdoors but also to identify alternatives, ones with fewer societal costs, for law enforcement to use while working to identify and apprehend terrorists and other criminals. Law enforcement should be able to use legal hacking, with these two key stipulations:
Government agencies must realize that a backdoor for one is a backdoor for all. Backdoors violate the public's trust and can help, not handicap, terrorists. For the same reason, security companies shouldn't build backdoors into their software — that would leave hospitals, businesses, banks, and consumers vulnerable. The approach should be to lawfully use technology to collect and analyze the ever-growing volumes of data that terrorists and other criminals create when they use social media networks, instant messaging clients, email, and even online video game chat rooms to distribute propaganda.