Monday, August 15, 2016

Apple Security

Seth Weintraub:

Apple hasn’t often made appearances at the Black Hat hacker conference, but this year Cupertino is Thinking Different™ about security. Head of Apple security, Ivan Krstic, today said the company would pay huge (up to $200K) bug bounties to invited researchers who find and report vulnerabilities in certain Apple software.

Kate Conger (via Hacker News):

In the past, Apple has cited high bids from governments and black markets as one reason not to get into the bounty business. The reasoning went: If you’re going to be outbid by another buyer, why bother bidding at all? While $200,000 is certainly a sizable reward — one of the highest offered in corporate bug bounty programs — it won’t beat the payouts researchers can earn from law enforcement or the black market. The FBI reportedly paid nearly $1 million for the exploit it used to break into an iPhone used by Syed Farook, one of the individuals involved in the San Bernardino shooting last December.

A bug bounty program is unlikely to tempt any hackers who are only interested in getting a massive payout. For those who only care about cash, Mogull said Apple could probably never pay enough. But for those who care about making an impact, getting a check from Apple could make all the difference. “This is about incentivizing the good work,” Mogull explained.

John Gruber:

Both the bounty program and the mere fact that Krstic was speaking at Black Hat are signs of Apple’s thawing relationship with the security industry.

Ivan Krstić (tweet):

Each SEP [Secure Enclave Processor] has reference access to a unique private key (UID)

UID generated by SEP itself immediately after fabrication, using its own free-running oscillator TRNG

Available for cryptographic operations via commands exposed by the Secure ROM

No access to UID key material from SEP or other mutable software after fuses blown

[…]

Production devices can be “demoted” to enable some debugging features like JTAG and loading development software on the AP (but not the SEP)

Requires full OS erase and device explicitly authorized by the personalization server

Forces a different UID on the SEP, no access to existing user data after demotion
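To make the UID design concrete, here’s a toy sketch in Python of the property Krstić is describing: a device-unique key that is generated inside the part, never exported, only usable through a fixed set of operations, and replaced on demotion so that keys derived from the old UID can never be re-derived. The class and method names are invented for illustration; they don’t correspond to any real SEP interface.

```python
import os
import hmac
import hashlib


class ToySecureEnclave:
    """Toy model of the SEP-style design: a device-unique key (UID) that is
    generated internally, never exported, and only usable through a fixed
    set of operations."""

    def __init__(self):
        # Stands in for the UID the SEP generates with its own TRNG after
        # fabrication; real hardware never exposes this value.
        self._uid = os.urandom(32)

    def derive_key(self, label: bytes) -> bytes:
        # Callers only ever get keys *derived from* the UID, never the UID.
        return hmac.new(self._uid, label, hashlib.sha256).digest()

    def demote(self):
        # Demotion forces a different UID, so anything protected under keys
        # derived from the old UID becomes unrecoverable.
        self._uid = os.urandom(32)


sep = ToySecureEnclave()
k1 = sep.derive_key(b"file-encryption")
sep.demote()
k2 = sep.derive_key(b"file-encryption")
assert k1 != k2  # old user-data keys can no longer be re-derived
```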

Matthew Green (Hacker News):

A few years ago Apple quietly introduced a new service called iCloud Keychain. This service is designed to allow you to back up your passwords and secret keys to the cloud. Now, if backing up your sensitive passwords gives you the willies, you aren’t crazy. Since these probably include things like bank and email passwords, you really want these to be kept extremely secure.

[…]

So Apple finds itself in a situation where they can’t trust the user to pick a strong password. They can’t trust their own infrastructure. And they can’t trust themselves. That’s a problem. Fundamentally, computer security requires some degree of trust -- someone has to be reliable somewhere.

Apple’s solution is clever: they decided to make something more trustworthy than themselves. To create a new trust anchor, Apple purchased a bunch of fancy devices called Hardware Security Modules, or HSMs. These are sophisticated, tamper-resistant specialized computers that store and operate with cryptographic keys, while preventing even malicious users from extracting them. The high-end HSMs Apple uses also allow the owner to include custom programming.

[…]

Note that on HSMs like the one Apple is using, the code signing keys live on a special set of admin smartcards. To remove these keys as a concern, once Apple is done programming the HSM, they run these cards through a process that they call a “physical one-way hash function”. […] So, with the code signing keys destroyed, updating the HSM to allow nefarious actions should not be possible. Pretty much the only action Apple can take is to wipe the HSM, which would destroy the HSM’s RSA secret keys and thus all of the encrypted records it’s responsible for. […] The downside for Apple, of course, is that there had better not be a bug in any of their programming. Because right now there’s nothing they can do to fix it -- except to wipe all of their HSMs and start over.
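Here’s a minimal sketch, in Python, of the property Green describes: once the admin cards are destroyed, the HSM’s program can never change, and the only remaining administrative action is a wipe that destroys the keys (and with them the escrowed records). Everything here is a toy model; the names and the “program” are stand-ins, not Apple’s actual HSM configuration.

```python
class ToyHSM:
    """Toy model: after the admin (code signing) cards are destroyed, the
    program can never change; the only move left is a wipe, which destroys
    the secret keys themselves."""

    def __init__(self, program):
        self._program = program           # custom logic loaded at setup
        self._secret_key = "RSA-secret"   # placeholder for the escrow key
        self._can_reprogram = True

    def destroy_signing_cards(self):
        # Models running the admin smartcards through the
        # "physical one-way hash function" (i.e. destroying them).
        self._can_reprogram = False

    def update_program(self, new_program):
        if not self._can_reprogram:
            raise PermissionError("no signing keys left; cannot update")
        self._program = new_program

    def wipe(self):
        # The only remaining administrative action: destroys the secret
        # key, and with it every record encrypted to this HSM.
        self._secret_key = None


hsm = ToyHSM(program="escrow-v1")
hsm.destroy_signing_cards()
try:
    hsm.update_program("escrow-v2")
except PermissionError as e:
    print(e)  # no signing keys left; cannot update
```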

Update (2016-08-17): Here’s the video of Krstić’s talk.

Update (2016-09-20): Bruce Schneier:

Ever since Ivan Krstić, Apple’s Head of Security Engineering and Architecture, presented the company’s key backup technology at Black Hat 2016, people have been pointing to it as evidence that the company can create a secure backdoor for law enforcement.

It’s not.

Why Don’t Podcasts Use VBR MP3s?

Marco Arment (Hacker News):

AVFoundation, the low-level audio/video framework in iOS and macOS, does not accurately seek within VBR MP3s, making VBR impractical to use for long files such as podcasts. Jumping to a timestamp in an hour-long VBR podcast can result in an error of over a minute, without the listener even knowing because the displayed timecode shows the expected time.

[…]

Three simple solutions to accurate VBR stream-seeking have existed for almost twenty years to embed seek-offset tables at the start of VBR MP3s for precise seeking[…] But AVFoundation supports none of them.
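For reference, here’s a rough Python sketch of how a Xing-style seek table is typically used: the header stores 100 entries, where entry i gives the byte position (scaled to 0–255) of the point i% of the way through the audio, and the player interpolates between entries to turn a timestamp into a byte offset. The function and parameter names are mine, not AVFoundation’s; treat it as an illustration of the idea rather than a reference implementation.

```python
def xing_seek_offset(target_seconds, total_seconds, total_bytes, toc):
    """Approximate byte offset for a seek using a Xing-style TOC: 100
    entries, where toc[i] is the byte position (scaled to 0-255) of the
    point i% of the way through the audio."""
    if total_seconds <= 0:
        return 0
    if not 0 <= target_seconds <= total_seconds:
        raise ValueError("seek target outside the file")
    percent = target_seconds / total_seconds * 100.0
    i = min(int(percent), 99)
    a = toc[i]
    b = toc[i + 1] if i + 1 < 100 else 256.0
    # Linear interpolation between the two surrounding table entries.
    scaled = a + (b - a) * (percent - i)
    return int(scaled / 256.0 * total_bytes)
```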

Update (2016-08-21): See also: Accidental Tech Podcast.

Microsoft Leaks Its Golden Key

Daniel Eran Dilger:

Microsoft has demonstrated why the FBI’s desire for “Golden Key” backdoors allowing “good guys” to bypass security is such a bad idea: it inadvertently released its own keys to Windows tablets, phones, HoloLens and other devices using UEFI Secure Boot.

Tom Mendelsohn (via Bruce Schneier):

Secure Boot works at the firmware level, and is designed only to allow an operating system signed with a key certified by Microsoft to load. It can be disabled on many desktops, but on most other Windows devices, it’s hard-coded in. The golden key policy seems to have been designed for internal debugging purposes, to allow OS signature checks to be disabled, apparently so programmers can test new builds. In practice, it could well open up Microsoft’s tablets and phones to serious attacks.

[…]

Microsoft has now responded to the Secure Boot blooper.

The company said: “The jailbreak technique described in the researchers’ report on August 10 does not apply to desktop or enterprise PC systems. It requires physical access and administrator rights to ARM and RT devices and does not compromise encryption protections.”

Matthew Garrett:

Unfortunately older versions of the boot loader will happily load a supplementary policy as if it were a full policy, ignoring the fact that it doesn’t include a device ID. The loaded policy replaces the built-in policy, so in the absence of a base policy a supplementary policy as simple as “Enable this feature” will effectively remove all other restrictions.

Unfortunately for Microsoft, such a supplementary policy leaked. Installing it as a base policy on pre-Anniversary Edition boot loaders will then allow you to disable all integrity verification, including in the boot loader. Which means you can ask the boot loader to chain to any other executable, in turn allowing you to boot a compromised copy of any operating system you want (not just Windows).
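Garrett’s description boils down to a missing check, which a few lines of Python can loosely model: the old boot loaders accept a loaded policy as a full replacement for the built-in one even when it carries no device ID, whereas the intended behavior would refuse to let an unbound supplementary policy stand in as the base policy. This is a conceptual sketch, not Microsoft’s code; the field names are invented.

```python
def load_policy_old(built_in, loaded):
    # Pre-Anniversary behavior as Garrett describes it: any loaded policy
    # replaces the built-in one, even a supplementary policy with no
    # device ID binding it to a specific machine.
    return loaded


def load_policy_fixed(built_in, loaded, device_id):
    # Rough sketch of the intended behavior: a policy not bound to this
    # specific device cannot replace the base policy.
    if loaded.get("device_id") != device_id:
        return built_in
    return loaded


built_in = {"verify_signatures": True}
leaked_supplementary = {"verify_signatures": False}  # no device_id at all

print(load_policy_old(built_in, leaked_supplementary))          # checks disabled
print(load_policy_fixed(built_in, leaked_supplementary, "X1"))  # built-in kept
```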

ole man:

I can still remember a time when the ability to install any software on your very own computer wasn’t considered to be a “bug” or a “vulnerability”.

Previously: FBI Asks Apple for Secure Golden Key.

1Password Cloud Services Incompatible With VPNs

Matt Henderson:

Unfortunately, however, I recently discovered that all of our 1Password applications (iOS and Mac) have stopped syncing their data with 1Password’s servers. And to make matters worse, the apps don’t provide any feedback to the user that synchronization has failed! It was only after removing a Families account from one of the devices and trying to add it back that I finally saw a “No response from server” error.


[…]

Right now, because so few users are affected by this, 1Password’s response is just: “Sorry, you can’t use our service if you’re going to use a VPN.”

[…]

If you’re going to put your software API in front of CloudFlare, as 1Password has done, then you must also engineer a model and user experience that accounts for false positives.
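One way to read that last point: the client should at least notice when the “API response” is really a challenge page and say so, instead of silently doing nothing. Here’s a hypothetical sketch in Python of that kind of check; the URL, response format, and error type are illustrative, not 1Password’s actual API.

```python
import json
import urllib.request


class SyncBlockedError(Exception):
    """Raised when the endpoint answers with something other than the
    expected JSON, e.g. a CDN challenge page served to a VPN exit address."""


def fetch_vault(url):
    # Hypothetical sync call; the URL and response handling are
    # illustrative, not 1Password's actual API.
    with urllib.request.urlopen(url) as resp:
        body = resp.read()
        content_type = resp.headers.get("Content-Type", "")
    if "json" not in content_type:
        # Don't fail silently: surface that sync is blocked upstream
        # (challenge page, captive portal, flagged VPN exit, etc.).
        raise SyncBlockedError(
            "Server returned %r instead of JSON; sync may be blocked upstream"
            % content_type)
    return json.loads(body)
```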

Dave Teare:

Adding a CAPTCHA like that is certainly an option and we may take that route. We need to keep in mind that we’d need to do this on all the client apps as well, so it’s not a trivial change. Hopefully we can get there.

Why Night Shift May Seem a Little Half-hearted

Ben Lovejoy:

When one German customer emailed Apple’s SVP of software engineering Craig Federighi to suggest that Night Shift had more blue light than f.lux, he received a reply explaining that there was good reason for this[…]

Craig Federighi:

Given the display technology we push it as far as we can without introducing major red ghosting artifacts when scrolling / animating. (Unfortunately, the red phosphors in the LCD hold their color longer and when we shift the display too far into the red then scrolling results in irritating ghosting artifacts).

Update (2016-08-16): etendue:

Federighi’s response is nonsensical: LCDs don’t use phosphors in the color filter plane.