26 April 2019

CRACKING THE CRYPTO WAR


ON DECEMBER 2, 2015, a man named Syed Rizwan Farook and his wife, Tashfeen Malik, opened fire on employees of the Department of Public Health in San Bernardino, California, killing 14 people and injuring 22 during what was supposed to be a staff meeting and holiday celebration. The shooters were tracked down and killed later in the day, and FBI agents wasted no time trying to understand the motivations of Farook and to get the fullest possible sense of his contacts and his network. But there was a problem: Farook’s iPhone 5c was protected by Apple’s default encryption system. Even when served with a warrant, Apple did not have the ability to extract the information from its own product.

The government filed a court order, demanding, essentially, that Apple create a new version of the operating system that would enable it to unlock that single iPhone. Apple defended itself, with CEO Tim Cook framing the request as a threat to individual liberty.


“We have a responsibility to help you protect your data and protect your privacy,” he said in a press conference. Then-FBI chief James Comey reportedly warned that Cook’s attitude could cost lives. “I just don’t want to get to a day where people look at us with tears in their eyes and say, ‘My daughter is missing and you have her cell phone—what do you mean you can’t tell me who she was texting before she disappeared?’ ” The controversy over Farook’s iPhone reignited a debate that was known in the 1990s as the Crypto Wars, when the government feared the world was “going dark” and tried—and ultimately failed—to impede the adoption of technologies that could encode people’s information. Only this time, with supercomputers in everybody’s pockets and the endless war on terror, the stakes were higher than ever.

A few months after the San Bernardino shooting, President Obama sat for an interview at the South by Southwest conference and argued that government officials must be given some kind of shortcut—or what’s known as exceptional access—to encrypted content during criminal and antiterrorism investigations. “My conclusion so far is that you cannot take an absolutist view on this,” he said. “If the tech community says, ‘Either we have strong, perfect encryption or else it’s Big Brother and an Orwellian world’—what you’ll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed, and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties.”

In typical Obama fashion, the president was leaning toward a compromise, a grand bargain between those who insist that the NSA and FBI need all the information they can get to monitor potential terrorists or zero in on child abusers and those who believe building any sort of exceptional access into our phones would be a fast track to a totalitarian surveillance state. And like so many of Obama’s proposed compromises, this one went nowhere. To many cryptographers, there was simply no way that companies like Apple and Google could provide the government with legal access to customer data without compromising personal privacy and even national security. Exceptional access was a form of technology, after all, and any of its inevitable glitches, flaws, or bugs could be exploited to catastrophic ends. To suggest otherwise, they argued, was flat wrong. Flat-Earth wrong. Which was, as any good engineer or designer knows, an open invitation for someone to prove them wrong.

THIS PAST JANUARY, Ray Ozzie took a train from his home in Massachusetts to New York City for a meeting in a conference room of the Data Science Institute at Columbia University. The 14th-floor aerie was ringed by wide windows and looked out on a clear but chilly day. About 15 people sat around the conference table, most of them middle-aged academics—people from the law school, scholars in government policy, and computer scientists, including cryptographers and security specialists—nibbling on a light lunch while waiting for Ozzie’s presentation to begin.

Jeannette Wing—the host of the meeting and a former corporate VP of Microsoft Research who now heads the Data Science Institute—introduced Ozzie to the group. In the invitation to this “private, informal session,” she’d referenced his background, albeit briefly. Ozzie was once chief technical officer at Microsoft as well as its chief software architect, posts he had assumed after leaving IBM, where he’d gone to work after the company had purchased a product he created, Lotus Notes. Packed in that sentence was the stuff of legend: Notes was a groundbreaking product that rocketed businesses into internet-style communications when the internet was barely a thing. The only other person who ever held the chief software architect post at Microsoft was Bill Gates, and Ozzie had also helped create the company’s cloud business.

He had come to Columbia with a proposal to address the impasse over exceptional access, and the host invited the group to “critique it in a constructive way.” Ozzie, trim and vigorous at 62, acknowledged off the bat that he was dealing with a polarizing issue. The cryptographic and civil liberties community argued that solving the problem was virtually impossible, which “kind of bothers me,” he said. “In engineering if you think hard enough, you can come up with a solution.” He believed he had one.

He started his presentation, outlining a scheme that would give law enforcement access to encrypted data without significantly increasing security risks for the billions of people who use encrypted devices. He’d named his idea Clear.

It works this way: The vendor—say it’s Apple in this case, but it could be Google or any other tech company—starts by generating a pair of complementary keys. One, called the vendor’s “public key,” is stored in every iPhone and iPad. The other vendor key is its “private key.” That one is stored with Apple, protected with the same maniacal care that Apple uses to protect the secret keys that certify its operating system updates. These safety measures typically involve a tamper-proof machine (known as an HSM or hardware security module) that lives in a vault in a specially protected building under biometric lock and smartcard key.

That public and private key pair can be used to encrypt and decrypt a secret PIN that each user’s device automatically generates upon activation. Think of it as an extra password to unlock the device. This secret PIN is stored on the device, and it’s protected by encrypting it with the vendor’s public key. Once this is done, no one can decode it and use the PIN to unlock the phone except the vendor, using that highly protected private key.
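To make the mechanics concrete, here is a minimal sketch of that wrapping step in Python, using the widely used cryptography library. Ozzie’s proposal doesn’t specify particular algorithms, so the RSA-OAEP choice, the PIN format, and all the names below are illustrative assumptions rather than details of Clear itself:

```python
# Illustrative sketch only -- algorithm choices (RSA-OAEP) and names are
# assumptions; Ozzie's Clear proposal does not specify an implementation.
import secrets
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Vendor side: generate the complementary key pair. In the scheme described
# above, the private half would live inside an HSM in a guarded vault.
vendor_private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
vendor_public_key = vendor_private_key.public_key()   # shipped in every device

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Device side, at activation: generate a random secret PIN and encrypt it
# to the vendor's public key. The device stores only the wrapped ciphertext;
# without the vault-held private key, no one can recover the PIN from it.
device_pin = secrets.token_hex(8)                      # e.g. 16 hex digits
wrapped_pin = vendor_public_key.encrypt(device_pin.encode(), OAEP)
```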

So, say the FBI needs the contents of an iPhone. First the Feds have to actually get the device and the proper court authorization to access the information it contains—Ozzie’s system does not allow the authorities to remotely snatch information. With the phone in their possession, investigators could then access the encrypted PIN through the lock screen and send it to Apple. Armed with that information, Apple would send highly trusted employees into the vault where they could use the private key to unlock the PIN. Apple could then send that no-longer-secret PIN back to the government, which could use it to unlock the device.
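In code, the vendor-side step of that flow would amount to decrypting the wrapped PIN with the private key that never leaves the vault. The sketch below is again only an assumption about what such a step could look like, in the same hypothetical terms as the snippet above, not a detail taken from Ozzie’s presentation:

```python
# Hypothetical vendor-side step of the flow described above; names and
# algorithm choices are assumptions, not part of Ozzie's actual proposal.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def vendor_unwrap_pin(vendor_private_key: rsa.RSAPrivateKey,
                      wrapped_pin: bytes) -> str:
    """Run only after a court order checks out; in practice this would
    happen inside the vault, against the HSM-held private key."""
    return vendor_private_key.decrypt(wrapped_pin, OAEP).decode()

# Flow: seize the device and get a warrant -> read the wrapped PIN via the
# lock screen -> vendor decrypts it -> the PIN goes back to investigators,
# who use it to unlock that one phone.
```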

Ozzie designed other features meant to reassure skeptics. Clear works on only one device at a time: Obtaining one phone’s PIN would not give the authorities the means to crack anyone else’s phone. Also, when a phone is unlocked with Clear, a special chip inside the phone blows itself up, freezing the contents of the phone thereafter. This prevents any tampering with the contents of the phone. Clear can’t be used for ongoing surveillance, Ozzie told the Columbia group, because once it is employed, the phone can no longer be used.

He waited for the questions, and for the next two hours, there were plenty of them. The word risk came up. The most dramatic comment came from computer science professor and cryptographer Eran Tromer. With the flair of Hercule Poirot revealing the murderer, he announced that he’d discovered a weakness. He spun a wild scenario involving a stolen phone, a second hacked phone, and a bank robbery. Ozzie conceded that Tromer found a flaw, but not one that couldn’t be fixed.

At the end of the meeting, Ozzie felt he’d gotten some good feedback. He might not have changed anyone’s position, but he also knew that unlocking minds can be harder than unlocking an encrypted iPhone. Still, he’d taken another baby step in what is now a two-years-and-counting quest. By focusing on the engineering problem, he’d started to change the debate about how best to balance privacy and law enforcement access. “I do not want us to hide behind a technological smoke screen,” he said that day at Columbia. “Let’s debate it. Don’t hide the fact that it might be possible.”

In his home office outside Boston, Ray Ozzie works on a volunteer project designing and making safety-testing kits for people in nuclear radiation zones. (Cole Wilson)

The first, and most famous, exceptional-access scheme was codenamed Nirvana. Its creator was an NSA assistant deputy director named Clinton Brooks, who realized in the late 1980s that newly discovered advances in cryptography could be a disaster for law enforcement and intelligence agencies. After initial despair, Brooks came up with an idea that he envisioned would protect people’s privacy while preserving government’s ability to get vital information. It involved generating a set of encryption keys, unique to each device, that would be held by government in heavily protected escrow. Only with legal warrants could the keys be retrieved and then used to decode encrypted data. Everyone would get what they wanted. Thus … Nirvana.

The plan was spectacularly botched. Brooks’ intent was to slowly cook up an impervious technical framework and carefully introduce it in the context of a broad and serious national discussion about encryption policy, where all stakeholders would hash out the relative trade-offs of law enforcement access to information and privacy. But in 1992, AT&T developed the Telephone Security Device 3600, which could scramble phone conversations. Its strong encryption and relatively low price unleashed a crypto panic in the NSA, the FBI, and even the tech-friendly officials in the new Clinton administration. Then the idea came up of using Brooks’ key escrow technology, which by that time was being implemented with a specialized component called the Clipper Chip, to combat these enhanced encryption systems. After a few weeks, the president himself agreed to the plan, announcing it on April 16, 1993.

All hell broke loose as technologists and civil libertarians warned of an Orwellian future in which the government possessed a backdoor to all our information. Suddenly the obscure field of cryptography became a hot button. (I still have a T-shirt with the rallying cry “Don’t Give Big Brother a Master Key.”) And very good questions were raised: How could tech companies sell their wares overseas if foreign customers knew the US could get into their stuff? Wouldn’t actual criminals use other alternatives to encrypt data? Would Clipper Chip technology, moving at government speed, hobble the fast-moving tech world?

Ultimately, Clipper’s death came not from policy, but science. A young Bell Labs cryptographer named Matt Blaze discovered a fatal vulnerability, undoubtedly an artifact of the system’s rushed implementation. Blaze’s hack led the front page of The New York Times. The fiasco tainted all subsequent attempts at installing government backdoors, and by 1999, most government efforts to regulate cryptography had been abandoned, with barely a murmur from the FBI or the NSA.

For the next dozen or so years, there seemed to be a Pax Cryptographa. You seldom heard the government complain about not having enough access to people’s personal information. But that was in large part because the government already had a frightening abundance of access, a fact made clear in 2013 by Edward Snowden. When the NSA contractor revealed the extent of his employer’s surveillance capabilities, people were shocked at the breadth of its activities. Massive snooping programs were sweeping up our “metadata”—who we talk to, where we go—while court orders allowed investigators to scour what we stored in the cloud. The revelations were also a visceral blow to the leaders of the big tech companies, who discovered that their customers’ data had essentially been plundered at the source. They vowed to protect that data more assiduously, this time regarding the US government as one of their attackers. Their solution: encryption that even the companies themselves could not decode. The best example was the iPhone, which encrypted users’ data by default with iOS 8 in 2014.

Law enforcement officials, most notably Comey of the FBI, grew alarmed that these heightened encryption schemes would create a safe haven for crooks and terrorists. He directed his staff to look at the potential dangers of increasing encryption and began giving speeches that called for that blast from the past, lingering like a nasty chord from ’90s grunge: exceptional access.

The response from the cryptographic community was swift and simple: Can’t. Be. Done. In a landmark 2015 paper called “Keys Under Doormats,” a group of 15 cryptographers and computer security experts argued that, while law enforcement has reasons to argue for access to encrypted data, “a careful scientific analysis of the likely impact of such demands must distinguish what might be desirable from what is technically possible.” Their analysis claimed that there was no foreseeable way to do this. If the government tried to implement exceptional access, they wrote, it would “open doors through which criminals and malicious nation-states can attack the very individuals law enforcement seeks to defend.”

The 1990s Crypto Wars were back on, and Ray Ozzie didn’t like what he was hearing. The debate was becoming increasingly politicized. Experts in cryptography, he says, “were starting to pat themselves on the back, taking extreme positions about truisms that weren’t so obvious to me.” He knew that great achievements of cryptography had come from brilliant scientists using encryption protocols to perform a kind of magic: sharing secrets between two people who had never met, or creating digital currency that can’t be duplicated for the purposes of fraud. Could a secure system of exceptional access be so much harder? So Ozzie set out to crack the problem. He had the time to do it. He’d recently sold a company he founded in 2012, Talko, to Microsoft. And he was, to quote a friend, “post-economic,” having made enough money to free him from financial concerns. Working out of his home north of Boston, he began to fool around with some ideas. About two weeks later, he came up with Clear.

Inside Ray Ozzie's home office in Manchester, Massachusetts. Ozzie bought this 128k Mac in 1984 so he could access the floppy disk that contained the original UI designs for Lotus Notes. (Cole Wilson)

THE STRENGTH OF Ozzie’s system lies in its simplicity. Unlike Clinton Brooks, who relied on the government to safeguard the Clipper Chip’s encrypted keys, Ozzie is putting his trust in corporations, a decision that came from his experience in working for big companies like Lotus, IBM, and Microsoft. He was intimately familiar with the way that tech giants managed their keys. (You could even argue that he helped invent that structure, since Lotus Notes was the first software product to get a license to export strong encryption overseas and thus was able to build it into its products.) He argues that the security of the entire mobile universe already relies on the protection of keys—those vital keys used to verify operating system updates, whose compromise could put billions of users at risk. (Every time you do an OS update, Apple certifies it by adding a unique ID and “signing” it to let your device know it’s really Apple that is rewriting your iPhone’s code.) Using that same system to provide exceptional access, he says, introduces no new security weaknesses that vendors don’t already deal with.
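The update-signing machinery Ozzie points to follows a standard sign-and-verify pattern. Apple’s real pipeline isn’t public, so the snippet below is only a generic illustration of that pattern (RSA-PSS via Python’s cryptography library), not Apple’s or Ozzie’s actual code:

```python
# Generic sign/verify pattern for software updates -- an illustration of the
# analogy in the text, not Apple's actual signing infrastructure.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

signing_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
verify_key = signing_key.public_key()    # the half baked into every device

update_blob = b"...operating system image bytes..."
PSS = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = signing_key.sign(update_blob, PSS, hashes.SHA256())

# Device side: install the update only if the signature verifies against
# the embedded public key; anything tampered with is rejected.
try:
    verify_key.verify(signature, update_blob, PSS, hashes.SHA256())
    print("update accepted")
except InvalidSignature:
    print("update rejected")
```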

Ozzie knew that his proposal danced on the third rail of the crypto debate—many before him who had hinted at a technical solution to exceptional access have been greeted with social media pitchforks. So he decided to roll out his proposal quietly, showing Clear to small audiences under an informal nondisclosure agreement. The purpose was to get feedback on his system, and, if he was lucky, to jar some people out of the mindset that regarded exceptional access as a crime against science. His first stop, in September 2016, was in Seattle, where he met with his former colleagues at Microsoft. Bill Gates greeted the idea enthusiastically. Another former colleague, Butler Lampson—a winner of the Turing Award, the Nobel Prize of computer science—calls the approach “completely reasonable … The idea that there’s no way to engineer a secure way of access is ridiculous.” (Microsoft has no formal comment.)

Ozzie went on to show Clear to representatives from several of the biggest tech companies—Apple, Google, Facebook—none of whom had any interest whatsoever in voluntarily implementing any sort of exceptional access. Their focus was to serve their customers, and their customers want security. (Or, as Facebook put it in a statement to WIRED: “We have yet to hear of a technical solution to this challenge that would not risk weakening security for all users.”) At one company, Ozzie squared off against a technical person who found the proposal offensive. “I’ve seen this happen to engineers a million times when they get backed into a corner,” Ozzie says. “I told him ‘I’m not saying you should do this. I’m trying to refute the argument that it can’t be done.’ ”

Unsurprisingly, Ozzie got an enthusiastic reception from the law enforcement and intelligence communities. “It’s not just whether his scheme is workable,” says Rich Littlehale, a special agent in the Tennessee Bureau of Investigation. “It’s the fact that someone with his experience and understanding is presenting it.” In an informal meeting with NSA employees at its Maryland headquarters, Ozzie was startled to hear that the agency had come up with something almost identical at some point. They’d even given it a codename.

During the course of his meetings, Ozzie learned he was not alone in grappling with this issue. The names of three other scientists working on exceptional access popped up—Ernie Brickell, Stefan Savage, and Robert Thibadeau—and he thought it might be a good idea if they all met in private. Last August the four scientists gathered in Meg Whitman’s boardroom at Hewlett Packard Enterprise in Palo Alto. (Ozzie is a board member, and she let him borrow the space.) Though Thibadeau’s work pursued a different course, Ozzie found that the other two were pursuing solutions similar to his. What’s more, Savage has bona fides to rival Ozzie’s. He’s a world-renowned expert on security research, and he and Ozzie share the same motivations. “We say we are scientists, and we let the data take us where they will, but not on this issue,” Savage says. “People I very much respect are saying this can’t be done. That’s not why I got into this business.”

Ozzie’s efforts come as the government is getting increasingly desperate to gain access to encrypted information. In a speech earlier this year, FBI director Christopher Wray said the agency was locked out of 7,775 devices in 2017. He declared the situation intolerable. “I reject this notion that there could be such a place that no matter what kind of lawful authority you have, it’s utterly beyond reach to protect innocent citizens,” he said.

Deputy attorney general Rod Rosenstein, in a speech at the Naval Academy late last year, was even more strident. “Warrant-proof encryption defeats the constitutional balance by elevating privacy above public safety,” he said. What’s needed, he said, is “responsible encryption … secure encryption that allows access only with judicial authorization.”


Since Apple, Google, Facebook, and the rest don’t see much upside in changing their systems, only a legislative demand could grant law enforcement exceptional access. But there doesn’t seem to be much appetite in Congress to require tech companies to tailor their software to serve the needs of law enforcement agencies. That might change in the wake of some major incident, especially if it were discovered that advance notice might have been gleaned from an encrypted mobile device.

As an alternative to exceptional access, cryptographers and civil libertarians have begun promoting an approach known as lawful hacking. It turns out that there is a growing industry of private contractors who are skilled in identifying flaws in the systems that lock up information. In the San Bernardino case, the FBI paid a reported $900,000 to an unnamed contractor to help them access the data on Farook’s iPhone. Many had suspected that the mysterious contractor was an Israeli company called Cellebrite, which has a thriving business in extracting data from iPhones for law enforcement agencies. (Cellebrite has refused to confirm or deny its involvement in the case, and its representatives declined to comment for this story.) A report by a think tank called the EastWest Institute concluded that other than exceptional access, lawful hacking is the only workable alternative.

But is it ethical? It seems odd to have security specialists promoting a system that depends on a reliable stream of vulnerabilities for hired hackers to exploit. Think about it: Apple can’t access its customers’ data—but some random company in Israel can fetch it for its paying customers? And with even the NSA unable to protect its own hacking tools, isn’t it inevitable that the break-in secrets of these private companies will eventually fall into the hands of criminals and other bad actors? There is also a danger that forces within the big tech companies could enrich themselves through lawful hacking. As one law enforcement official pointed out to me, lawful hacking creates a marketplace for so-called zero-day flaws—vulnerabilities discovered by outsiders that the manufacturers don’t know about—and thus can be exploited by legal and nonlegal attackers. So we shouldn’t be surprised if malefactors inside tech companies create and bury these trapdoors in products, with hopes of selling them later to the “lawful hackers.”

Lawful hacking is techno-capitalism at its shadiest, and, in terms of security alone, it makes the mechanisms underlying Clear (court orders, tamper-proof contents) look that much more appealing. No matter where you stand in the crypto debate, it makes sense that a carefully considered means of implementing exceptional access would be far superior to a scheme that’s hastily concocted in the aftermath of a disaster. (See Clipper.) But such an approach goes nowhere unless people believe that it doesn’t violate math, physics, and Tim Cook’s vows to his customers. That is the bar that Ozzie hopes he can clear.

The “Keys Under Doormats” gang has raised some good criticisms of Clear, and for the record, they resent Ozzie’s implication that their minds are closed. “The answer is always, show me a proposal that doesn’t harm security,” says Dan Boneh, a celebrated cryptographer who teaches at Stanford. “How do we balance that against the legitimate need of security to unlock phones? I wish I could tell you.”

One of the most salient objections goes to the heart of Ozzie’s claim that his system doesn’t really increase risk to a user’s privacy, because manufacturers like Apple already employ intricate protocols to protect the keys that verify their operating system updates. Ozzie’s detractors reject the equivalence. “The exceptional access key is different from the signing key,” says Susan Landau, a computer scientist who was also a coauthor of the “Doormat” paper. “A signing key is used rarely, but the exceptional access key will be used a lot.” The implication is that setting up a system to protect the PINs of billions of phones, and process thousands of requests from law enforcement, will inevitably have huge gaps in security. Ozzie says this really isn’t a problem. Invoking his experience as a top executive at major tech firms, he says that they already have frameworks that can securely handle keys at scale. Apple, for example, uses a key system so that thousands of developers can be verified as genuine—the iOS ecosystem couldn’t work otherwise.
