Monday, August 15, 2016

Apple Security

Seth Weintraub:

Apple hasn’t often made appearances at the Black Hat hacker conference, but this year Cupertino is Thinking Different™ about security. Head of Apple security, Ivan Krstić, today said the company would pay huge (up to $200K) bug bounties to invited researchers who find and report vulnerabilities in certain Apple software.

Kate Conger (via Hacker News):

In the past, Apple has cited high bids from governments and black markets as one reason not to get into the bounty business. The reasoning went: If you’re going to be outbid by another buyer, why bother bidding at all? While $200,000 is certainly a sizable reward — one of the highest offered in corporate bug bounty programs — it won’t beat the payouts researchers can earn from law enforcement or the black market. The FBI reportedly paid nearly $1 million for the exploit it used to break into an iPhone used by Syed Farook, one of the individuals involved in the San Bernardino shooting last December.

A bug bounty program is unlikely to tempt any hackers who are only interested in getting a massive payout. For those who only care about cash, Mogull said Apple could probably never pay enough. But for those who care about making an impact, getting a check from Apple could make all the difference. “This is about incentivizing the good work,” Mogull explained.

John Gruber:

Both the bounty program and the mere fact that Krstic was speaking at Black Hat are signs of Apple’s thawing relationship with the security industry.

Ivan Krstić (tweet):

Each SEP [Secure Enclave Processor] has reference access to a unique private key (UID)

UID generated by SEP itself immediately after fabrication, using its own free-running oscillator TRNG

Available for cryptographic operations via commands exposed by the Secure ROM

No access to UID key material from SEP or other mutable software after fuses blown

[…]

Production devices can be “demoted” to enable some debugging features like JTAG and loading development software on the AP (but not the SEP)

Requires full OS erase and device explicitly authorized by the personalization server

Forces a different UID on the SEP, no access to existing user data after demotion
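Krstić’s bullets describe a containment pattern: the UID key is generated inside the SEP, is never readable by any software, and is only usable indirectly through crypto operations that derive per-purpose keys from it. Here is a toy Python sketch of that interface shape — entirely invented names, not Apple’s implementation, and software where the real SEP uses hardware and Secure ROM — just to illustrate why derived keys differ per purpose while the UID itself stays unreachable:

```python
import hashlib
import hmac
import os


class ToySecureEnclave:
    """Toy model of a SEP-style UID key (illustrative only, not Apple's design):
    the UID is generated internally and never exported; callers can only
    request keys derived from it for a named purpose."""

    def __init__(self):
        # Stands in for fabrication-time generation from the SEP's own TRNG:
        # the raw value exists only inside the enclave object.
        self.__uid = os.urandom(32)

    def derive_key(self, context: bytes) -> bytes:
        # Per-purpose keys are derived from the UID; the UID never leaves.
        return hmac.new(self.__uid, context, hashlib.sha256).digest()


sep = ToySecureEnclave()
file_key = sep.derive_key(b"file-encryption")
keychain_key = sep.derive_key(b"keychain")
assert file_key != keychain_key       # distinct keys per purpose
assert not hasattr(sep, "uid")        # no public handle to the raw UID
```

The “demotion” bullet above fits the same model: forcing a different UID after demotion means every previously derived key changes, so existing user data becomes undecryptable by construction.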

Matthew Green (Hacker News):

A few years ago Apple quietly introduced a new service called iCloud Keychain. This service is designed to allow you to back up your passwords and secret keys to the cloud. Now, if backing up your sensitive passwords gives you the willies, you aren’t crazy. Since these probably include things like bank and email passwords, you really want these to be kept extremely secure.

[…]

So Apple finds itself in a situation where they can’t trust the user to pick a strong password. They can’t trust their own infrastructure. And they can’t trust themselves. That’s a problem. Fundamentally, computer security requires some degree of trust -- someone has to be reliable somewhere.

Apple’s solution is clever: they decided to make something more trustworthy than themselves. To create a new trust anchor, Apple purchased a bunch of fancy devices called Hardware Security Modules, or HSMs. These are sophisticated, tamper-resistant specialized computers that store and operate with cryptographic keys, while preventing even malicious users from extracting them. The high-end HSMs Apple uses also allow the owner to include custom programming.

[…]

Note that on HSMs like the one Apple is using, the code signing keys live on a special set of admin smartcards. To remove these keys as a concern, once Apple is done programming the HSM, they run these cards through a process that they call a “physical one-way hash function”. […] So, with the code signing keys destroyed, updating the HSM to allow nefarious actions should not be possible. Pretty much the only action Apple can take is to wipe the HSM, which would destroy the HSM’s RSA secret keys and thus all of the encrypted records it’s responsible for. […] The downside for Apple, of course, is that there had better not be a bug in any of their programming. Because right now there’s nothing they can do to fix it -- except to wipe all of their HSMs and start over.
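The escrow logic Green describes boils down to two rules the HSM enforces: a record is released only on the correct (possibly weak) user passcode, and too many wrong guesses permanently destroys the record, so neither Apple nor an attacker can brute-force it. A minimal Python sketch of that policy — hypothetical class and method names, a software stand-in for tamper-resistant hardware, and a deliberately simplified guess counter:

```python
import hashlib
import os
import secrets


class ToyEscrowHSM:
    """Toy model of guess-limited key escrow (illustrative only, not
    Apple's code): correct passcode releases the secret; exhausting the
    guess budget destroys the record irrecoverably."""

    MAX_ATTEMPTS = 10  # assumed limit for illustration

    def __init__(self):
        # record_id -> [salt, passcode_digest, secret, attempts_left]
        self._records = {}

    def escrow(self, record_id: str, passcode: bytes, secret: bytes) -> None:
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", passcode, salt, 100_000)
        self._records[record_id] = [salt, digest, secret, self.MAX_ATTEMPTS]

    def recover(self, record_id: str, passcode: bytes) -> bytes:
        rec = self._records.get(record_id)
        if rec is None:
            raise KeyError("record destroyed or unknown")
        salt, digest, secret, _ = rec
        guess = hashlib.pbkdf2_hmac("sha256", passcode, salt, 100_000)
        if secrets.compare_digest(guess, digest):
            rec[3] = self.MAX_ATTEMPTS  # success resets the budget
            return secret
        rec[3] -= 1
        if rec[3] <= 0:
            del self._records[record_id]  # brute force burns the record
        raise ValueError("wrong passcode")


hsm = ToyEscrowHSM()
hsm.escrow("device-123", b"1234", b"escrowed keybag key")
assert hsm.recover("device-123", b"1234") == b"escrowed keybag key"
```

Green’s point about the destroyed signing cards maps onto this sketch as: the `MAX_ATTEMPTS` policy can no longer be reprogrammed, so Apple’s only remaining move against its own HSMs is the equivalent of clearing `_records` wholesale.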

Update (2016-08-17): Here’s the video of Krstić’s talk.

Update (2016-09-20): Bruce Schneier:

Ever since Ivan Krstić, Apple’s Head of Security Engineering and Architecture, presented the company’s key backup technology at Black Hat 2016, people have been pointing to it as evidence that the company can create a secure backdoor for law enforcement.

It’s not.
