Technology vs. Policy: How Legally Mandated Backdoors Compromise Security

August 18, 2015

The increasing demand for surveillance-proof computing has led to more advanced forms of encryption. Most notably, in 2014, Apple released an operating system that cannot be unlocked; even with a lawful warrant, Apple itself lacks the technological capacity to crack into a password-locked device. Google announced its own plans for encryption the next day. Pre-existing systems that use whole-disk or end-to-end encryption are also rising in popularity.

Law enforcement broadly, and the Justice Department specifically, were not pleased with this development in technology. In an effort spearheaded by FBI Director James Comey and Deputy Attorney General Sally Quillian Yates, the U.S. government is trying to expand its capacity to compel tech giants like Apple and Google to build a so-called “backdoor” into their encrypted devices.

A question of perspective: Technology versus policy

Naturally, DoJ’s argument raises some concerns among a wide array of policy makers and privacy advocates. The notion that we would compromise security, potentially for the entire population of internet users, in order to marginally increase the effectiveness of law enforcement does not sit well with some audiences. The broad consensus among technology experts is that building government access into otherwise fully encrypted systems would weaken the effectiveness of those systems.

Most notably, a group of computer security experts published a report through MIT, “Keys Under Doormats,” objecting to such mandatory government access. They argued that “the complexity of today’s Internet environment, with millions of apps and globally connected services, means that new law enforcement requirements are likely to introduce unanticipated, hard to detect security flaws.” From a technological standpoint, their case is hard to refute. The proposed backdoor capability does introduce a vulnerability into modern encryption practices.

However, from a policy standpoint, others have argued that the gains for law enforcement may be worth the security risk. A common argument appeals to emotionally fraught stories of criminals who evade prosecution because the necessary evidence is unobtainable, locked behind a wall of technological protection. Some analysts counter, pragmatically, that criminals and other nefarious actors will simply find other mechanisms to encrypt their data.

On a more nuanced level, analysts have pointed out that even encrypted systems are likely to transmit their data to an outside system. Think, for example, of an iCloud backup: an iPhone may be totally secure, even from Apple itself, but if its files, photos, contacts, and messages are backed up to iCloud, Apple still has the technological capacity to access that information (which, as a matter of policy, it does not share). This theoretical vulnerability has yet to lead to disaster, so why should we assume that similar access to the phone itself would prove a significantly greater threat?

An uncomfortable policy

One particularly perplexing aspect of the encryption debate is that the government genuinely does have legal precedent for its request. Under a 1994 surveillance law, the Communications Assistance for Law Enforcement Act (CALEA), the government can require telecommunications companies to build surveillance capabilities into their infrastructure. More recent expansions of the law cover Voice over Internet Protocol (VoIP) services and some broadband providers. Current access under CALEA, however, stops short of internet platform companies. Presumably, expansion into that territory is the basis for Comey’s campaign.

The underlying premise of CALEA (and its possible expansion) is hard to digest because the law runs counter to most understandings of the limits placed on law enforcement. As former judge and Secretary of Homeland Security Michael Chertoff pointed out at this year’s Aspen Security Forum, “…we do not historically organize our society to make it maximally easy for law enforcement, even with court orders, to get information.” Benjamin Wittes of the Lawfare blog made a similar observation: “the Fourth Amendment is not, in fact, a guarantee of the authority to surveil with a warrant. It is prohibition (generally speaking) of surveillance without one.”

While the public might generally accept that law enforcement can access a suspect’s data after obtaining a warrant, requiring tech companies to bake the capacity for surveillance into their products, particularly at the expense of those products’ effectiveness, is a step beyond the realm of public comfort. Nor is this the first time the country has had this conversation.

The introduction of readily available encryption for public use in the mid-1990s generated similar concerns about cryptography among policy makers and law enforcement, a similar appeal to the private sector, and a very similar backlash from the public (often from the same voices active in today’s debate). Ultimately, popular opinion won out, and the proposed measure (most notably the Clipper Chip’s key-escrow scheme) petered out rather ignominiously by the late 1990s, which is hardly an encouraging precedent for today’s policy makers.

Evaluating the costs of access

In 2014, law professor Orin Kerr asked, “How is the public interest served by a policy that only thwarts lawful search warrants?” (He publicly refined his views shortly thereafter.) He was making the case that a requirement for access to an encrypted system is not fundamentally different from any other execution of search and seizure under the Fourth Amendment. The problem with Kerr’s logic, however, is that this particular policy carries additional exogenous costs. That is to say, the privacy risks involved affect more than just the subjects of lawful warrants.

If we think of encryption as a public good that protects us from the erosion of privacy at the hands of corporate, state and individual actors, then introducing any vulnerability into secure encryption devalues the public good for all users, not just the subjects of lawful warrants. In this sense, the expansion of CALEA to include a broader range of technologies is fundamentally different from previous expansions of the law; its potential consequences could be drastically more costly.

Exactly how costly this security vulnerability might eventually prove is unclear for two reasons. First, we do not know the specifics of the security involved. Researchers could quantify the degree of risk by using statistical frameworks to estimate how much time, in expectation, it would take to crack an encrypted system (e.g., the time needed for a malicious hacker to access a system through a hypothetical government-mandated backdoor). But to make such an estimate, researchers would need to know the specifics of the systems involved, and since the government has suggested that companies design the backdoor to maximize security within their own systems, researchers are rather light on those details.
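
To give a sense of what such an estimate looks like, here is a minimal back-of-the-envelope sketch in Python (my own illustration, not drawn from the research described above). It assumes the only attack is brute-forcing a uniformly random key, in which case the expected number of guesses is half the keyspace; the key sizes and the attacker’s guess rate below are hypothetical.

# Back-of-the-envelope estimate of expected brute-force time for a random key.
# Illustrative only: the key sizes and guess rate are hypothetical assumptions,
# not figures for any real system or proposed backdoor.

def expected_crack_years(key_bits: int, guesses_per_second: float) -> float:
    """Expected years to exhaustively search for a uniformly random key.

    On average, an attacker succeeds after trying half the keyspace.
    """
    expected_guesses = 2 ** (key_bits - 1)  # half of 2**key_bits
    seconds = expected_guesses / guesses_per_second
    return seconds / (60 * 60 * 24 * 365)

ASSUMED_RATE = 1e12  # hypothetical attacker: one trillion guesses per second

for bits in (40, 56, 80, 128):
    print(f"{bits}-bit key: ~{expected_crack_years(bits, ASSUMED_RATE):.3g} years")

Even in this toy model, the answer is only as good as its parameters: without knowing how a mandated backdoor would change the effective keyspace or the attack surface, no responsible estimate can be produced, which is precisely the researchers’ problem.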

Second, advances in computing make the specifics of future security systems unknowable. Encryption algorithms (and particularly the methods and computational power used to break them) have changed drastically over a relatively short period, so from the perspective of today’s policy makers, the future costs of legislation mandating a backdoor are nearly impossible to predict.

DoJ and the FBI have precedent for their backdoor request in CALEA and in a history of mandatory coordination enabling surveillance, but the current issue differs from previous cases in its potential costs, a fact with real consequences for the government’s relationship with the private sector. Technology is clearly outpacing the capabilities of law enforcement and defense, and government has quite sensibly responded by enlisting the support of the private sector. However, like any relationship, the public-private partnership is transactional. Whether bound by law or not, there is a limit to how far government can push the private sector to produce unpalatable or infeasible products. Pushing this issue will have consequences for the health of that partnership the next time the government needs private-sector buy-in. Ultimately, Director Comey should think carefully about whether the short-term gain of backdoor access is worth the potential long-term consequences.

Laura K. Bate is an assistant director and program associate at The Center for the National Interest.
