Monday, March 7, 2016

Federighi and Cryptographers on FBI vs. Apple

Craig Federighi (via Tim Hardwick):

Your phone is more than a personal device. In today’s mobile, networked world, it’s part of the security perimeter that protects your family and co-workers. Our nation’s vital infrastructure — such as power grids and transportation hubs — becomes more vulnerable when individual devices get hacked. Criminals and terrorists who want to infiltrate systems and disrupt sensitive networks may start their attacks through access to just one person’s smartphone.

[…]

That’s why it’s so disappointing that the FBI, Justice Department and others in law enforcement are pressing us to turn back the clock to a less-secure time and less-secure technologies. They have suggested that the safeguards of iOS 7 were good enough and that we should simply go back to the security standards of 2013. But the security of iOS 7, while cutting-edge at the time, has since been breached by hackers. What’s worse, some of their methods have been productized and are now available for sale to attackers who are less skilled but often more malicious.

I don’t understand what this second part is referring to. It doesn’t sound like what we were talking about before.

Paul Wagenseil (via Hacker News):

“I don’t think this case is about backdoors,” said Adi Shamir, who with his MIT colleagues Leonard Adleman and Ron Rivest developed the RSA encryption algorithm in 1977. “The FBI is asking Apple to do something very specific. It’s got nothing to do with placing backdoors in millions of phones around the world.”

Martin Hellman, who developed the Diffie-Hellman encryption-key exchange with Whitfield Diffie at Stanford in 1976, disagreed, as did Rivest and Diffie.

[…]

“[Apple] put themselves in a position where they could state they could no longer help,” [Shamir] added. “But they failed because they didn’t close this particular loophole in which Apple can help the FBI. Apple should close this loophole, and then they can really make the argument.”

Indeed, the backdoor is already there in that current phones will accept software updates signed by Apple, without wiping the user data. So, in theory, the FBI could simply compel Apple to hand over its signing key and then build itself the tool that it wants. The line of argument about government conscripting Apple engineers to do custom software development is a red herring.

Likewise, it makes sense to worry about creating a special OS build, because what if it got out? But we already face an equivalent risk today: if Apple’s signing key somehow got out, anyone could build that OS themselves. No one seems to be talking about that possibility, even though the difference is just a matter of some engineering.

This will all get a lot more interesting when Apple makes a phone that’s secure from Apple itself.

Blake Ross:

Governments decided that allowing crew members to fully override the flying pilot using a key code would be insecure, since it would be too easy for that code to leak. Thus, there is nothing the outside pilot can do — whether electronically or violently — to open the door if the flying pilot is both conscious and malicious.

[…]

What’s striking is that this incident did not prompt any change in cockpit protocol in the United States. The FAA is improving mental health checks, but at 30,000 feet, we still have a security system where the parameters are widely known to criminals; where the method of abuse is clear; where we see no way for people outside the cockpit to stop it; and we’ve still decided the public is best served by keeping the people in the cockpit in charge of the lock.

[…]

Unbreakable phones are coming. We’ll have to decide who controls the cockpit: The captain? Or the cabin?

Update (2016-03-11): Christopher Soghoian (via John Gruber):

DOJ: We tried to be nice. We could just force Apple to turn over the iOS source code and code signing keys.


Steve:

I have read a few statements by law enforcement about how great everything was in iOS 7, that no "legitimate" user needs stronger protection than that, and that the increased security of iOS 8 and later only benefits pedophiles and terrorists. That is the mentality Craig is addressing in your second quoted paragraph.

Barry:

"The line of argument about government conscripting Apple engineers to do custom software development is a red herring."

It's clear you haven't been following the case; that is the entire legal issue at stake here and what Judge Orenstein ruled on.

@Steve Thank you. So Federighi was responding to general statements, rather than something in the order itself?

@Barry I saw Orenstein’s ruling, which was for a different case. Also:

The New York magistrate echoed at least one of the key arguments Apple is making in its other, higher-profile fight with the government: the All Writs Act (AWA) can’t be used to order a technology company to manipulate its products, he said.

My point is that they don’t really need Apple to manipulate its products. They just need it to give them the key.

deeje:

You can't be compelled to sign things. See the EFF's amicus brief re: code as free speech and the case law against compelled speech. Fascinating stuff!

@deeje Interesting, indeed. But can you be forced to surrender your key so that others can sign things with it?

The more I read, the more I'm convinced that the greatest threats lie behind the word "just" or "simply."
