6 October 2014

NSA and FBI May Be Unhappy About Google and Apple Encrypting Smartphone Data, But You Should Be Happy

Police want back doors in smartphones, but you never know who else will open them

Craig Timberg
Washington Post
October 2, 2014

The government’s increasingly loud complaints about Apple and Google’s tough new forms of smartphone encryption have sidestepped a crucial fact: The same security measures that make it hard for police to get into electronic devices also deter others – be they foreign governments, business rivals or creepy guys looking to steal your photos and post them on the Internet.

In other words, it’s not technically possible to build a back door for the FBI without weakening a smartphone’s security in fundamental ways. Doors are made to be opened, and once they’re built, you never know who might find a way to get in.

Such is the consensus view of security experts. And to hear them tell it, the reality of back doors is even worse than it may seem at first glance.

Imagine a house made entirely of bricks, with no doors or windows. It’s as secure as it will ever be. Now cut a hole into the bricks and install a door. No matter how thick the door or tough the lock, the house is now more vulnerable to intrusion in at least three ways: The door can be battered down. The keys can be stolen. And all the things that make doors work – the hinges, the lock, the door jamb – become targets for attackers. They need to defeat only one to make the whole system fail.
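The analogy maps directly onto how a cryptographic back door would work. Here is a minimal sketch in Python (using the widely available cryptography package; the escrow_key and encrypt_with_escrow names are illustrative inventions, not any vendor’s actual design) of a key-escrow scheme in which every message key is wrapped a second time for a government-held key. The last two lines are the point: one stolen escrow key opens every house built this way.

    # A toy key-escrow ("back door") scheme -- illustrative only.
    # Requires: pip install cryptography
    from cryptography.fernet import Fernet

    # One long-lived escrow key held by a third party: the "door."
    escrow_key = Fernet.generate_key()
    escrow = Fernet(escrow_key)

    def encrypt_with_escrow(plaintext: bytes, user_key: bytes):
        """Encrypt data under a fresh key, then wrap that key twice:
        once for the user and once for the escrow holder."""
        data_key = Fernet.generate_key()
        ciphertext = Fernet(data_key).encrypt(plaintext)
        wrapped_for_user = Fernet(user_key).encrypt(data_key)
        wrapped_for_escrow = escrow.encrypt(data_key)  # the built-in door
        return ciphertext, wrapped_for_user, wrapped_for_escrow

    alice_key = Fernet.generate_key()
    ct, w_user, w_escrow = encrypt_with_escrow(b"private photos", alice_key)

    # The user can decrypt her own data...
    print(Fernet(Fernet(alice_key).decrypt(w_user)).decrypt(ct))

    # ...but so can anyone who obtains the single escrow key. Every
    # message ever protected this way falls to one stolen secret.
    print(Fernet(escrow.decrypt(w_escrow)).decrypt(ct))

Every hinge in the analogy is here: the wrapping code, the escrow key’s storage, the process for handing it to police – each one more mechanism that must work perfectly, forever, for the door to stay shut.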

Add to that the inherent bugginess of computer code, and you’ve got a recipe for insecurity.

“You’re certainly weaker than if you hadn’t built the door before,” said Johns Hopkins cryptology expert Matthew Green. “We don’t know how to write perfect code.”

This debate was touched off by Apple and Google’s announcements last month that their newest mobile operating systems, available on the latest iPhones, iPads and Android devices, will automatically protect user data with forms of encryption that the companies are unable to unlock for police – even when they have valid search warrants.
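In practice, “unable to unlock” means the key never exists anywhere except on the device, where it is derived from the user’s passcode. The Python sketch below shows the general idea; it is a deliberately simplified stand-in for Apple’s and Google’s actual designs, which also bind the key to hardware so that passcode guessing must happen on the phone itself.

    # Passcode-derived device encryption -- a simplified illustration.
    # Requires: pip install cryptography
    import base64, hashlib, os
    from cryptography.fernet import Fernet

    def device_key(passcode: str, device_salt: bytes) -> bytes:
        """Stretch the passcode into an encryption key. Only someone
        who knows the passcode (and holds the device's salt) can
        re-derive it; the manufacturer never sees or stores any key."""
        raw = hashlib.pbkdf2_hmac("sha256", passcode.encode(),
                                  device_salt, 200_000)
        return base64.urlsafe_b64encode(raw)

    salt = os.urandom(16)  # stored on the device itself
    key = device_key("my-passcode", salt)
    token = Fernet(key).encrypt(b"text messages, photos, contacts")

    # Decryption means re-deriving the same key from the passcode.
    assert (Fernet(device_key("my-passcode", salt)).decrypt(token)
            == b"text messages, photos, contacts")

Serve a warrant on the company under a design like this and the honest answer is that there is nothing on its servers capable of decrypting the phone.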

The backlash from law enforcement was immediate and fierce, reaching the point where Attorney General Eric H. Holder Jr. this week publicly urged companies to leave back doors in their devices to protect children from kidnappers and sexual predators. “It is fully possible to permit law enforcement to do its job while still adequately protecting personal privacy,” he said.

Holder’s statement raised the question of what “adequately” means. Because, security experts agree, there is a clear tradeoff: More law enforcement access means less security and, as a result, less privacy.

“It’s not just that somebody is going to use the same back door that law enforcement uses,” said Matthew Blaze, a University of Pennsylvania cryptology expert. “It’s that introducing the back door is very likely to either introduce or exacerbate a flaw in the software.”

The past couple of years have underscored just how flawed software often is. Target, Home Depot, Neiman Marcus and their tens of millions of customers found that out the hard way, as did dozens of Hollywood celebrities who had their intimate personal photos stolen by criminals who reportedly used a hacking tool marketed largely to police worldwide.

The point is that hackers – whether they be creeps, spies or insomniac college students – are inherently clever and relentless. They find the weakest spots and create holes. Back doors provide natural targets for their energies.

“People are starting to be sensitized that there is a right way and a wrong way to do data security,” said Jennifer Granick, director of civil liberties at the Stanford Center for Internet and Society. “We’ve been doing it the wrong way.”

For privacy activists such as Granick, the smartphone encryption debate carries echoes of the fight over the “Clipper Chip,” a Clinton administration idea pushed by the National Security Agency. The chip was an encryption device that would also have provided the government with a back door into telecommunications systems, making interception of calls much easier.

The effort eventually failed amid political opposition, but not before Blaze – the University of Pennsylvania researcher quoted above – discovered that the “Clipper Chip” produced by the NSA had crucial security flaws. It turned out to be a back door that a skilled hacker could easily break through.
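Specifically, Blaze’s 1994 paper showed that the chip’s Law Enforcement Access Field (LEAF) – the piece that delivered each session key to the government – was validated by only a 16-bit checksum. With just 2^16 = 65,536 possibilities, a rogue user could generate random bogus fields until one happened to pass the check, keeping the strong encryption while cutting the eavesdroppers out. The Python toy below shows how cheap that search is; the checksum function is an invented stand-in, since the real algorithm was classified and only its 16-bit size matters here.

    # Why a 16-bit checksum is no protection: a toy LEAF forgery.
    import hashlib, os

    def checksum16(field: bytes) -> bytes:
        """Stand-in for Clipper's 16-bit LEAF checksum."""
        return hashlib.sha256(field).digest()[:2]

    target = checksum16(b"the genuine escrow field")

    # Brute-force a bogus field whose checksum happens to match.
    tries = 0
    while True:
        tries += 1
        bogus = os.urandom(24)
        if checksum16(bogus) == target:
            break

    print(f"Forged a passing field after {tries:,} random tries")
    # Expected: about 65,536 attempts on average. Blaze's real attack,
    # rate-limited by the Clipper hardware itself, still averaged well
    # under an hour -- far too fast for the back door to be trusted.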
