Wednesday, April 17, 2019

The Time Tim Cook Stood His Ground Against the FBI

Leander Kahney (Hacker News):

Cook was very concerned about how Apple would be perceived throughout this media firestorm. He wanted very much to use it as an opportunity to educate the public about personal security, privacy, and encryption. “I think a lot of reporters saw a new version, a new face of Apple,” said the PR person, who asked to remain anonymous. “And it was Tim’s decision to act in this fashion. Very different from what we have done in the past. We were sometimes sending out emails to reporters three times a day on keeping them updated.”

[…]

Privacy advocates celebrated the end of the case and Apple’s apparent victory. “The FBI’s credibility just hit a new low,” said Evan Greer, campaign director for Fight for the Future, an activist group that promotes online privacy. “They repeatedly lied to the court and the public in pursuit of a dangerous precedent that would have made all of us less safe. Fortunately, internet users mobilized quickly and powerfully to educate the public about the dangers of backdoors, and together we forced the government to back down.”

But Cook was personally disappointed that the case didn’t come to trial. Even though Apple had “won” and wouldn’t be forced to create the backdoor, nothing had really been resolved. “Tim was a little disappointed that we didn’t get a resolution,” said Sewell. He “really felt it would have been fair and it would have been appropriate for us to have tested these theories in court. . . . [Though] the situation that was left at the end of that was not a bad one for us, he would have preferred to go ahead and try the case.”

I still think this story has been mostly misreported in that Apple already had a backdoor to access Syed Farook’s iPhone 5c. Commenter lern_too_spel:

What really happened is that Apple loudly proclaimed that they had made it impossible to comply with government data requests and even had a marketing page masquerading as a privacy page explaining that. The FBI asked Apple to put a build on a phone that would allow them to brute force the passcode, leaving the device and the build on Apple’s premises the entire time. This showed that Apple’s claim was false in practice. Apple quickly removed that marketing page in the wake of the news.

[…]

At the time Apple made the false marketing claims, no passcode was required to install a signed build. Hence, the FBI’s request.

The FBI was asking for no more than what Apple could already do, and it was letting Apple control the whole process. The problem was that what Apple could already do disagreed with what Apple told its customers that it could do.
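To put rough numbers on what was being asked: the order, as I understand it, was for a build that disabled the erase-after-ten-failures option and the escalating delays between passcode attempts, and that let passcodes be submitted electronically rather than typed by hand. Here’s a back-of-the-envelope sketch (my numbers, not Apple’s, apart from the ~80 ms per-attempt key-derivation time its security documentation has cited) of how quickly a short numeric passcode falls once those mitigations are gone:

```swift
import Foundation

// Back-of-the-envelope estimate, not Apple's code: how long a passcode
// brute force would take if the escalating retry delays and the
// "erase after 10 failed attempts" option were disabled, which is what
// the requested build would have allowed.
// Assumption: ~80 ms per attempt, the hardware-entangled key-derivation
// time cited in Apple's iOS security documentation.
let secondsPerAttempt = 0.08

let passcodeSpaces: [(label: String, combinations: Double)] = [
    ("4-digit numeric passcode", 10_000),
    ("6-digit numeric passcode", 1_000_000),
]

for space in passcodeSpaces {
    let worstCaseMinutes = space.combinations * secondsPerAttempt / 60
    print(space.label + ": roughly " +
          String(format: "%.0f", worstCaseMinutes) + " minutes, worst case")
}
```

Even the worst case for a 4-digit passcode is measured in minutes; only a long alphanumeric passphrase would have meaningfully resisted, with or without the special build.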





Lily Ballard

I thought Apple was clear in that they thought merely creating a build like that was far too dangerous, even if it was supposed to never leave Apple campus. The only way to truly ensure that build never leaks is to never produce it in the first place. And of course phones with a secure enclave literally cannot be upgraded without the passcode, it's just that the 5C didn't have a secure enclave.


@Lily There are several different issues here. I agree that Apple was clear about the part that you mention. The parts I take issue with are that (a) there was all this talk about creating a backdoor, but the backdoor (being able to install a special build that could access the user data) already existed (when there was no Secure Enclave), and (b) Apple had previously said that it was not possible for them to access your data:

On devices running iOS 8, your personal data such as photos, messages (including attachments), email, contacts, call history, iTunes content, notes, and reminders is placed under the protection of your passcode. Unlike our competitors, Apple cannot bypass your passcode and therefore cannot access this data. So it’s not technically feasible for us to respond to government warrants for the extraction of this data from devices in their possession running iOS 8.

and this was clearly false. About six months later, they got the government request, and the reason this is a story is that they could have extracted the data. And then there’s (c), my common refrain that much of this is irrelevant because in most cases the data is in iCloud backups, which are easy for Apple to access, no special build required, but of course neither Apple nor the FBI really wanted to talk about that.


Ghost Quartz

> the backdoor (being able to install a special build that could access the user data) already existed

I think calling something a backdoor implies that it is a vulnerability that was intentionally introduced for the purposes of weakening device security. Who is to say whether the vulnerability in question (installing firmware upgrades without user consent such that protection against bruteforcing can be bypassed) was intentional or an oversight? Would you consider a memory validation bug that bypasses these bruteforce protections to be a backdoor as well?

> much of this is irrelevant because in most cases the data is in iCloud backups, which are easy for Apple to access

This is absolutely relevant: I can (and do) choose to forego iCloud backups, but there’s nothing I can do to mitigate the risk of ineffective passcode security. Apple even argued during this whole debacle that they aren’t trying to impede FBI investigations because they are willing and able to comply with lawful requests to access data stored on iCloud servers, but that they draw the line at creating firmware that exploits this vulnerability.


Even if the capability existed, the legal issue of the government forcing Apple to use its resources (create a special build) is a big constitutional question that went unanswered because the case was dropped. Conscripting private citizens or private entities to perform government functions: that’s the big question. Should that be allowed?

Both sides wanted this fight and the government blinked at the last second, owing partly to their admission that they had not exhausted all other routes of investigation (third-party tools, iCloud backups, etc.).

Whether Apple had the capability or not is not the big question in my mind, unless you think their march towards tighter security is somehow disingenuous. Tight iOS security has been a decade in the making and is not all or nothing. Whether they misrepresented the hardware’s ability to load (or refuse to load) custom firmware seems to be the crux of some people’s issue, but that is not a legal argument to force Apple to do something. It’s simply a predicate that determines whether complying with the core decision (forcing Apple to do investigative work) would have been easier or harder for Apple.

I’m not a lawyer, and it has been a while since this issue came to pass. Hopefully the points above are helpful to the discussion.


@Ghost Yes, a bug (that wasn’t intentionally added) is not a backdoor. I don’t think this was a bug or oversight. My guess is that Apple wanted to avoid this vulnerability but didn’t have a good way of doing so until the Secure Enclave. I don’t blame them for that. The point is that the rhetoric was misleading because the vulnerability was already there and known.

There was all this talk about not wanting a master key to exist, but it already did: the existing source code. If that were to leak, other groups could just build the master key themselves. So maybe there is a moral line that they didn’t want to cross, but I don’t think there’s so much difference, practically speaking.

I mean irrelevant to the privacy of the average iPhone user (who uses iCloud). There was already a master key for iCloud. Is there really that much difference between having one kind of master key vs. another that exists (hopefully) only on Apple’s premises?

@Kyle Yeah, there are definitely legal and moral questions here. Those are not what I’m addressing in this post. I’m saying that Apple misrepresented what they were being asked to do, as well as the level of on-device security. And many people reporting on this issue conflated the security of the device with the security of the data.

But regarding conscription: the government could have asked Apple to hand over the code, and then it could have built what it needed without Apple engineers being forced to “build the backdoor” themselves. Whereas (in theory) now the actual backdoor/vulnerability is closed and so that would not be possible. And, secondly, Apple was already (by choice) “working with the FBI to try to unlock the phone.” They had already crossed the “hack our own users” line that Cook mentioned.


> I’m saying that Apple misrepresented what they were being asked to do

You're taking a much too narrow and technical view of this -- as is that commenter at Hacker News. A master key isn't just the technical ability to get into the phone, it's the ability to have that key usable and enduring. If they concede to the FBI and open the phone, the FBI will expect them to be able to do it going forward. It's the difference between having a broken window that someone might get into, but that you're going to fix as soon as possible, and having a door locked with a key that can be opened repeatedly.


@Totalitat I agree that there’s a slippery slope aspect to the FBI’s request, but I don’t see what that has to do with what I wrote. It’s as if there are millions of doors with locks that have already been set to open with the same key. And Apple already has the specifications for the key. But they’re saying that the problem is not the locks or the specs but the act of printing the key. Deciding not to do so in this instance is not making the doors any more secure. Apple could still be compelled to print it or surrender the specs at a future date.

And they had already changed to non-mastered locks with the iPhone 5s.


> It’s as if there are millions of doors with locks that have already been set to open with the same key

You're missing my point. You -- and the commentator -- are critiquing Apple for lying about whether there was a key or not, and I'm saying that a key is something deliberately created to be used over and over again. What Apple had was not something deliberately created for access but, in essence, a leftover broken window from an earlier era, one that Apple aimed to fix as soon as possible. What they were saying to the public and the FBI was that they were not going to deliberately create and sustain a way to access locked phones.

> And Apple already has the specifications for the key. But they’re saying that the problem is not the locks or the specs but the act of printing the key.

No, they're saying that this is not a door and so it was never supposed to have a key. It was a broken window and it should be fixed so that people couldn't get in it.

> Apple could still be compelled to print it or surrender the specs at a future date.

Er, yes, that was my point.


> The FBI was asking for no more than what Apple could already do, and it was letting Apple control the whole process. The problem was that what Apple could already do disagreed with what Apple told its customers that it could do.

Bingo. Real headline should be, "Apple disappointed it couldn't get a judge to certify that it could have its cake, and eat it, too."

I'm British so, with the greatest respect to those making a big thing of the constitutional relevance of this case in the US, it's all bollocks. We've got quite enough trouble as it is on this dank little island without some megacorporation wanting to secure for itself the exclusive right to decide whether or not a device is trustworthy on the basis of a judicial order from across the Atlantic instead of, as one might hope, the actual technical safeguards employed by the device. Sorry about that.


