Thursday, February 25, 2016

Apple Working on Removing iOS Backdoor

Matt Apuzzo and Katie Benner (comments):

Apple engineers have already begun developing new security measures that would make it impossible for the government to break into a locked iPhone using methods similar to those now at the center of a court fight in California, according to people close to the company and security experts.

John Gruber:

The way the iPhone works today, when put into recovery mode you can restore the operating system without entering the device passcode. The only restriction is that the version of iOS to be installed must be properly signed by Apple.


I think what Apple is leaking here is that they’re going to change this (perhaps as soon as this year’s new iPhone 7), so that you can’t install a new version of iOS, even in recovery mode, without entering the device’s passcode. (I think they will also do the same for firmware updates to the code that executes on the Secure Enclave — it will require a passcode lock.)

If you do a full restore, you can install a new version of the OS without the passcode, but this wipes the data.

It’s understandable that Tim Cook wants the conversation to be about the FBI asking Apple to build a backdoor. But I think a more accurate description is that the backdoor already exists. Apple today could update the OS to remove security protections, without wiping the data. The dispute with the FBI is that Apple doesn’t want to use the backdoor. And now it is working to remove it.
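The policy change being described can be sketched as a toy model. This is purely illustrative, not Apple's actual update code: the HMAC stands in for Apple's code-signing scheme, and the function names and the `passcode_entered` flag are hypothetical. The point is the extra gate on recovery-mode installs.

```python
import hmac
import hashlib

# Stand-in for Apple's private signing key (illustrative only).
APPLE_SIGNING_KEY = b"apple-signing-key"

def sign(image: bytes) -> bytes:
    """Toy 'Apple signature' over an OS image."""
    return hmac.new(APPLE_SIGNING_KEY, image, hashlib.sha256).digest()

def accept_update_today(image: bytes, signature: bytes,
                        in_recovery_mode: bool) -> bool:
    # Current behavior: a valid Apple signature is the only gate,
    # even in recovery mode, and user data is preserved.
    return hmac.compare_digest(sign(image), signature)

def accept_update_proposed(image: bytes, signature: bytes,
                           in_recovery_mode: bool,
                           passcode_entered: bool) -> bool:
    # Rumored change: recovery-mode installs would also require
    # the device passcode before accepting a signed image.
    if not hmac.compare_digest(sign(image), signature):
        return False
    if in_recovery_mode and not passcode_entered:
        return False
    return True

image = b"new iOS image"
sig = sign(image)

# Today, the FBI scenario works: a signed image installs without the passcode.
print(accept_update_today(image, sig, in_recovery_mode=True))        # True

# Under the proposed change, the same install is refused.
print(accept_update_proposed(image, sig, in_recovery_mode=True,
                             passcode_entered=False))                # False
```

In this framing, the "backdoor" is the first function: anyone holding the signing key can push new security-weakening code onto a locked phone without wiping it.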

Previously: FBI Asks Apple for Secure Golden Key.

Update (2016-02-29): Leif Ryge (via Alexis Gallagher):

So when Apple says the FBI is trying to “force us to build a backdoor into our products,” what they are really saying is that the FBI is trying to force them to use a backdoor which already exists in their products. (The fact that the FBI is also asking them to write new software is not as relevant, because they could pay somebody else to do that. The thing that Apple can provide which nobody else can is the signature.)

Is it reasonable to describe these single points of failure as backdoors? I think many people might argue that industry-standard systems for ensuring software update authenticity do not qualify as backdoors, perhaps because their existence is not secret or hidden in any way. But in the present Apple case where they are themselves using the word “backdoor,” abusing their cryptographic single point of failure is precisely what the FBI is demanding.

Update (2016-03-03): Alexis Gallagher:

Part of Apple’s defense rests on the fact that they don’t have the passcode, and the FBI is ordering them to create new software. […] What happens to Apple’s legal position if the FBI “only” orders Apple to hand over the signing keys (poor man’s passcode)?


Then the gov't will require they flash the firmware, or require that the hardware with the firmware be swapped out.

But I thought the barrier to entry was the "10 strikes and you're out" with the passcode. Seems like that'll always be the weak link. Make it easier to crack the passcode, and you can do whatever you want afterwards. At some point, you've got to run "unencrypted" code on the processor, and there's your worst-case attack vector. You can find the passcode without wiping.

It's all zeroes and ones. Until that's changed, there will be ways to exploit hardware in hand. You simply can't make an uncrackable computer, and especially not when the cracker has the source to the OS.

@Ruffin The main barrier is the speed/delay, not the 10 strikes. My understanding is that the speed can be enforced in hardware. I think you are underestimating the secure enclave and how it prevents the processor from dealing with unencrypted data. Please see this post. It seems like to crack it you would need to somehow open it up and read the hardware key with a microscope.
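The point about speed being the main barrier is easy to quantify. Per Apple's iOS security documentation, the passcode is entangled with a device-unique hardware key and each derivation attempt is calibrated to take roughly 80 ms, a floor the Secure Enclave enforces regardless of how fast the attacker's own hardware is. A back-of-the-envelope sketch (the 80 ms figure is from Apple's documentation; everything else is simple arithmetic):

```python
def brute_force_hours(passcode_space: int,
                      seconds_per_guess: float = 0.08) -> float:
    """Worst-case time to try every passcode, given a hardware-enforced
    ~80 ms per key-derivation attempt."""
    return passcode_space * seconds_per_guess / 3600

# 4-digit numeric passcode: 10,000 candidates -> under 15 minutes.
print(round(brute_force_hours(10 ** 4), 2))   # 0.22 hours

# 6-digit numeric passcode: 1,000,000 candidates -> roughly a day.
print(round(brute_force_hours(10 ** 6), 1))   # 22.2 hours

# 6-character alphanumeric: 62^6 candidates -> over a century.
print(round(brute_force_hours(62 ** 6) / (24 * 365)))  # ~144 years
```

This is why removing the 10-attempt wipe and the escalating software delays (which is what the FBI asked for) still leaves a meaningful barrier for strong passcodes, but very little for a 4-digit one.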

[…] Indeed, the backdoor is already there in that current phones will accept software updates signed by Apple, without wiping the user data. So, in theory, the FBI could simply compel Apple to hand over its signing key and then build itself the tool that it wants. The line of argument about government conscripting Apple engineers to do custom software development is a red herring. […]

