SAN FRANCISCO — In the age of mass digital surveillance, how private should your data and communications be? That question lies at the heart of the encryption panel that kicked off the Enigma Conference here yesterday (Jan. 27).
Four cryptography experts discussed the origins of the first "Crypto Wars" in the 1990s, the state of the current Crypto Wars between the government and technology companies — two weeks ago, the U.S. attorney general called out Apple for not unlocking a terror suspect's iPhones — and what's at stake now for consumers, companies and governments.
"It is a basic human right for two people to talk confidentially no matter where they are. This is sacrosanct," said Jon Callas, senior technologist at the American Civil Liberties Union (ACLU) and a veteran of the fight between the U.S. government and tech companies over the use of cryptography to protect digital communications in the 1990s.
It may be a human right, but most countries have not enshrined confidential conversations in their legal codes. What started as a resurgent fight against government surveillance in the wake of the documents leaked by Edward Snowden in 2013 has now bloomed into a larger struggle over who gets to encrypt communications and data.
In Snowden’s wake, end-to-end encrypted messaging has become far more accessible, while Apple and Google have introduced on-device encrypted data storage by default. But access to those services could soon depend on which country you are in and whose digital services you're using.
Location, location, location?
The 1990s Crypto Wars centered on the Clipper Chip, a hardware chip designed to encrypt phone users' calls against eavesdroppers while still letting the government listen in on demand. It was a "backdoor" that was going to be built into every phone.
But in 1994, cryptographer Matt Blaze, one of the panelists at yesterday's Enigma Conference talk, exposed security vulnerabilities in the Clipper Chip. Experts spent the next three years finding still more flaws and fighting in court to keep the chip out of devices.
Since the commercial internet was in its infancy at the time, legal and computer security experts had to take on faith that the World Wide Web would eventually be important, Blaze said. With the publication in 1997 of a report on the risks of key recovery that Blaze co-authored, most U.S. federal agencies stopped fighting against the cryptographers.
"The FBI became the only organization arguing that computer security was too good," Blaze said.
Today, no country has made government access to encrypted communications through a mandated backdoor the law of the land. But laws requiring varying degrees of government access to encrypted communications are becoming more common, said panelist Riana Pfefferkorn, associate director of surveillance and cybersecurity at the Stanford Law School Center for Internet and Society.
Following the panel discussion, Pfefferkorn said she sees a growing trend, especially in the United States and India, to tie serious liability issues, in both criminal and civil law, to the encryption debate.
"In the U.S., it's child pornography. In India, it's the threat of mob violence," Pfefferkorn said. "They seem like two separate issues, but they're a way of encouraging the regulation of encryption without regulating encryption.
"They're going to induce providers to stop deploying end-to-end encryption lest they face ruinous litigation," she added. "It feels like a bait-and-switch."
Vulnerability or backdoor?
Daniel Weitzner, the founding director of the Internet Policy Research Initiative at the Massachusetts Institute of Technology, noted during the panel that India's proposed changes to its intermediary liability law would make internet communications providers ("intermediaries") legally responsible for the actions and speech of their users.
He said India's proposals are similar to changes demanded by U.S. senators, including the draft EARN IT Act authored by Sens. Lindsey Graham (R-South Carolina) and Richard Blumenthal (D-Connecticut). Weitzner added that other countries have even tougher tech-liability laws on the books.
The United Kingdom passed the Investigatory Powers Act in 2016, also known as the Snoopers' Charter. It lets the British government issue statutorily vague Technical Capability Notices that can mandate encryption backdoors or otherwise force companies to stop using end-to-end encryption. The British government is never required to reveal the results of the evaluation process guiding the issuance of those notices.
Australia's Assistance and Access Act of 2018 is similar, except that it specifically bans the introduction of systemic vulnerabilities into the product in question. But that mandate raises a question the law never answers: What's the difference between a technical vulnerability and a legally mandated software backdoor?
How will tech companies react?
As technology has grown more complicated and nuanced since the 1990s, so has the burden of responsibility facing encryption's advocates. Proposals to change encryption should be tested "multiple times," both strategically and technically, argued the Carnegie Encryption Working Group in September 2019.
And Susan Landau and Denis McDonough argued in a column for The Hill that the tech community would be wiser to seek common ground with governments over data at rest, such as data stored on a locked iPhone, than over the more contentious data in transit embodied by end-to-end encrypted messaging apps.
Ultimately, the future of the consumer use of encryption is likely to depend heavily on the developers and companies that make it available.
They could split their products, offering different levels of encryption for different countries and regions, as Netscape did in the 1990s, said Pfefferkorn. Or they could refuse to offer encrypted products in countries or regions that demand weaker encryption or backdoor access.
"Or," Pfefferkorn said, "it could be broken for everyone."