Friday, October 10, 2014

Secure Golden Key

Jonathan Zdziarski:

So Apple fixed their security – so what? Well, they fixed it right… and that means that they fixed it so they, themselves, couldn’t break into it… which is the only way to do encryption right. They can’t break into their own phones, at least without using a password breaking tool. That is significant. So in fixing their security, Apple has now said to law enforcement, “we’re sorry, but we’d have to perform sophisticated attacks against our own products in order to even have a chance at dumping data for you.” What they haven’t said, but is very much also the truth, is “we’ve made our products secure enough so that we can’t even hack them … and can keep you safe from criminals, keep our public officials safe from spy agencies, and can keep our military safe from foreign governments – all looking to spy on, eavesdrop on, steal data from, and learn crucial intelligence to harm America (insert any other country here)”.

Bruce Schneier:

FBI Director James Comey claimed that Apple’s move allows people to “place themselves beyond the law” and also invoked that now overworked “child kidnapper.” John J. Escalante, chief of detectives for the Chicago police department now holds the title of most hysterical: “Apple will become the phone of choice for the pedophile.”

Matthew Green:

Since only the device itself knows UID -- and the UID can’t be removed from the Secure Enclave -- this means all password cracking attempts have to run on the device itself. That rules out the use of FPGA or ASICs to crack passwords. Of course Apple could write a custom firmware that attempts to crack the keys on the device but even in the best case such cracking could be pretty time consuming, thanks to the 80ms PBKDF2 timing.

(Apple pegs such cracking attempts at 5 1/2 years for a random 6-character password consisting of lowercase letters and numbers. PINs will obviously take much less time, sometimes as little as half an hour. Choose a good passphrase!)
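The arithmetic behind those estimates can be sketched in a few lines, assuming one guess every 80 ms (the PBKDF2 tuning mentioned above) and no escalating delays or lockouts. The function name and figures here are illustrative, not Apple's:

```python
# Back-of-the-envelope brute-force times at the ~80 ms per-guess
# PBKDF2 cost quoted above. Real devices may add escalating delays.

SECONDS_PER_GUESS = 0.080  # ~80 ms per password attempt

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Time to exhaust the full keyspace at one guess per 80 ms."""
    return (alphabet_size ** length) * SECONDS_PER_GUESS

# Random 6-character password, lowercase letters + digits (36 symbols):
six_char = worst_case_seconds(36, 6)
print(f"6-char a-z0-9: {six_char / (365.25 * 24 * 3600):.1f} years")

# 4-digit PIN, by contrast:
pin4 = worst_case_seconds(10, 4)
print(f"4-digit PIN:   {pin4 / 60:.0f} minutes")
```

Exhausting the 36⁶ keyspace works out to roughly 5.5 years, matching Apple's figure, while a 4-digit PIN falls in minutes; that gap is why the passphrase advice matters.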

The Washington Post:

How to resolve this? A police “back door” for all smartphones is undesirable — a back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.

Chris Coyne:

A “golden key” is just another, more pleasant, word for a backdoor—something that allows people access to your data without going through you directly. This backdoor would, by design, allow Apple and Google to view your password-protected files if they received a subpoena or some other government directive.


Apple’s anti-backdoor policy aims to protect everyone. The following is a list of real threats their policy would thwart. Not threats to terrorists or kidnappers, but to 300 million Americans and 7 billion humans who are moving their intimate documents into the cloud. Make no mistake, what Apple and Google are proposing protects you.

Whether you’re a regular, honest person, or a US legislator trying to understand this issue, understand this list.

Update (2014-10-14): Rich Mogull:

Law enforcement, especially federal law enforcement, has a history of desiring and imposing backdoors into technology. The Communications Assistance for Law Enforcement Act (CALEA) of 1994 requires all telecommunications equipment manufacturers to enable remote wiretapping for law enforcement in the hardware. But CALEA backdoors have also been abused by criminals and intelligence agencies.
