Saturday, July 19, 2014

iOS Backdoors, Attack Points, and Surveillance Mechanisms

Jonathan Zdziarski (PDF):

Overall, the otherwise great security of iOS has been compromised…by Apple…by design.

Via Michael Yacavone.

Update (2014-07-22): Jonathan Zdziarski:

I have NOT accused Apple of working with NSA, however I suspect (based on released documents) that some of these services MAY have been used by NSA to collect data on potential targets. I am not suggesting some grand conspiracy; there are, however, some services running in iOS that shouldn’t be there, that were intentionally added by Apple as part of the firmware, and that bypass backup encryption while copying more of your personal data than ever should come off the phone for the average consumer. I think at the very least, this warrants an explanation and disclosure to the some 600 million customers out there running iOS devices. At the same time, this is NOT a zero day and NOT some widespread security emergency. My paranoia level is tweaked, but not going crazy. My hope is that Apple will correct the problem. Nothing less, nothing more. I want these services off my phone. They don’t belong there.

Dan Goodin:

Zdziarski said the service that raises the most concern is known as com.apple.mobile.file_relay. It dishes out a staggering amount of data—including account data for e-mail, Twitter, iCloud, and other services, a full copy of the address book including deleted entries, the user cache folder, logs of geographic positions, and a complete dump of the user photo album—all without requiring a backup password to be entered. He said two other services dubbed com.apple.pcapd and com.apple.mobile.house_arrest may have legitimate uses for app developers or support people but can also be used to spy on users by government agencies or even jilted ex-lovers. The Pcapd service, for instance, allows people to wirelessly monitor all network traffic traveling into and out of the device, even when it’s not running in a special developer or support mode. House_arrest, meanwhile, allows the copying of sensitive files and documents from Twitter, Facebook, and many other applications.

[…]

Zdziarski said the services aren’t easy for anyone to abuse, making it unlikely that hackers could exploit them on a wide scale. Still, he said the functions are within easy reach of technically knowledgeable people who have access to a computer, electric charger, or other device that has ever been modified to digitally pair with a targeted iPhone or iPad. During the pairing process, iDevices create a file containing a set of digital keys. Anyone with access to such files can make almost unfettered use of the services, often wirelessly, until the iPhone or iPad undergoes a factory reset.

Dan Moren:

The company also reiterated its stance that it doesn’t compromise its systems for the purpose of providing those access points to the authorities: “As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services.”

While such statements may be intended to assuage fears over the privacy implications of these systems, they’re hard to classify as categorical denials in this case. For one thing, Apple hasn’t yet explained why anybody needs the breadth of information that these tools seem to provide access to, nor why these services, if indeed for diagnostic use, are not presented for users to opt into. In the case of enterprise environments where devices are provided by a company, users are generally made aware of the access that IT departments have to their devices. But when we’re talking about the general public, no such warning is given—nor should it be needed.

[…]

Apple has taken a firm stand on privacy, and it’s disappointing to see the company not fully and transparently explaining why these systems have the range of access that they do, why they circumvent security processes the company itself put into place, and why there’s no way for a user to easily disable them. That’s the kind of attitude that we’ve grown to expect from the company, and we’d like to see them live up to it.

Update (2014-07-23): Jonathan Zdziarski:

In a response from Apple PR to journalists about my HOPE/X talk, it looks like Apple might have inadvertently admitted that, in the classic sense of the word, they do indeed have back doors in iOS, though they claim that the purpose is for “diagnostics” and “enterprise”.

[…]

The problem with this is that these services dish out data (and bypass backup encryption) regardless of whether or not “Send Diagnostic Data to Apple” is turned on or off, and whether or not the device is managed by an enterprise policy of any kind. So if these services were intended for such purposes, you’d think they’d only work if the device was managed/supervised or if the user had enabled diagnostic mode. Unfortunately this isn’t the case and there is no way to disable these mechanisms. As a result, every single device has these features enabled and there’s no way to turn them off, nor are users prompted for consent to send this kind of personal data off the device.

[…]

Obviously, Apple realized that pairing in and of itself offered very little security, as they added backup encryption to all backups as a feature – something that also requires pairing to perform. So Apple doesn’t trust pairing as a “security” solution either. And for good reason: it wasn’t designed to be secure. It is not two-factor; it is not encrypted with a user passphrase; it is simply “something you have” that gives you complete, unfettered access to the phone. And it can be had as easily as copying one file, or created on the fly via USB. It can be used if law enforcement seizes your computer; it can be stolen by someone hacking in; it is by any measure insecure. But even with the pairing record, I would have expected the data that comes off my device to be encrypted with the backup password, right? These services completely bypass this.

Apple responds on its site, but doesn’t really address what people are concerned about (via Cabel Sasser).

Update (2014-07-24): Dan Goodin:

The episode is a good example of the way Apple’s trademark secrecy can come back to bite the company. Apple may have legitimate reasons for folding these services into iOS, even when it isn’t running in special diagnostic or support modes. But the company never took the time to disclose these services or to respond to Zdziarski’s private entreaties to executives until the undocumented functions became an international news story. Zdziarski’s larger point seems to be that the services he brought to light represent vectors that ex-lovers, housemates, co-workers and, yes, spy agencies can exploit to bypass cryptographic protections designed to prevent sensitive data from being accessed by unauthorized parties. Until last weekend, that point was only implicit. It has now been made explicit.

Wickwick:

Example: You’re dating someone who uses your laptop when she visits. Unbeknownst to you, she emails herself the keys from your laptop.

You break up. She visits the same Starbucks as you. While you’re reading email, she uses the wifi to turn on file relay and copies everything off your phone. There’s no alert, there’s no dialog box. Your phone just starts dumping information.

All an adversary needs is temporary access to a single trusted device once and they have the keys to the kingdom forever.

joosters:

What’s really disappointing is that there seems to be an all-or-nothing security model here. If I pair my phone with a computer, then suddenly it has complete access to spy on me, install monitoring tools that can continue to run, etc. Why can’t there be a way for me to transfer music/photos to/from my phone without granting this full device access?

You’d be pretty annoyed if the front door to your house, when you opened it, also opened up your document safe, emptied your wallet onto the floor and invited visitors to leave bugging devices to spy on you later.

Also, the defence of “just don’t agree to pair your phone with an unknown USB device” can actually be tricky. On a flight, I plugged my phone into the USB port on the seatback to charge it. The phone repeatedly kept asking if I wanted to pair it with something (who knows what it was? the entertainment system, maybe?). If I had accidentally hit the wrong button only once (on a prompt that randomly appeared), my phone could have been owned, and there’s no easy way to un-pair.

Update (2014-07-28): Jonathan Zdziarski:

In iOS, pcapd is available on every iOS device out there, and can be activated on any device without the user’s knowledge. You also don’t have to be enrolled in an enterprise policy, and you don’t have to be in developer mode. What makes this service dangerous is that it can be activated wirelessly, and does not ask the user for permission to activate it… so it can be employed for snooping by third parties in a privileged position.

[…]

Apple is being completely misleading by claiming that file relay is only for copying diagnostic data. If, by diagnostic data, you mean the user’s complete photo album, their SMS, Notes, Address Book, GeoLocation data, screenshots of the last thing they were looking at, and a ton of other personal data – then sure… but this data is far too personal in nature to ever be needed for diagnostics.

[…]

Additionally, there is the claim that your data is protected by data-protection encryption. The pairing record that is used to access all of this data is sent with an escrow bag, which contains a backup copy of your key bag keys for unlocking data-protection encryption. So again, we’re back to the fact that with any valid pairing, you have access to all of this personal data – whether that was Apple’s intention or not.

Now I hear the argument pop up from a few people who don’t understand how all of this works that, “of course you can dump personal info after you’ve paired, it’s supposed to sync your data”. Well, no. The trust dialog (the only pairing security there is) was only an afterthought that got added last year after another researcher showed how easily you could hijack iOS 6 by simply plugging it into a malicious charger. In fact, Apple added backup encryption to iOS specifically because they realized people’s devices were pairing with a bunch of hardware that the user didn’t trust. If pairing were meant to be a means to security, there would be no need for backup encryption at all.

[…]

In addition to downplaying the services themselves, Apple has stated that the user must “explicitly grant consent” for these services to be used. This is not the case. The user has had no idea these services even exist at all on the device until recently. There is no dialog asking the user to allow the packet sniffer to run, or to access your photos/contacts/sms/etc to provide to AppleCare (the dialogs you’re used to seeing third party apps present are not presented when these services are accessed). This consent simply doesn’t exist. The only consent is pushing that “trust” button, which (unbeknownst to the user) gives complete carte blanche access to the mobile device, wirelessly, indefinitely, and bypassing the backup encryption that the user believes is protecting their data from unwanted eyes.

Jonathan Zdziarski:

In spite of my warnings to the media (via email and telephone inquiries) not to pitch this as a conspiracy theory, they have still managed to completely derail the original intention of this research, and so I think a quick proof-of-concept video will help to clear up any misunderstandings about what this technique can and can’t do. I’ve also outlined the threat models that will and won’t work for this attack.

A Modest Proposal: C++ Resyntaxed

Ben Werther & Damian Conway (2002):

We describe an alternative syntactic binding for C++. This new binding includes a completely redesigned declaration/definition syntax for types, functions and objects, a simplified template syntax, and changes to several problematic operators and control structures. The resulting syntax is LALR(1) parsable and provides better consistency in the specification of similar constructs, better syntactic differentiation of dissimilar constructs, and greater overall readability of code.

Google, Roboto and Design PR

Khoi Vinh:

I don’t point this out to mock or criticize the author’s errors or misconceptions about what goes into designing typefaces, but rather in fact to marvel at how well Google is selling the story of its design efforts.

There’s essentially no news in this article other than, “Google has revised Roboto using some recent best practices of type design.” And yet the Mountain View company has been able to spin that non-story into a story that claims that the company is fundamentally reinventing typography.

Building assert() in Swift

Apple:

When designing Swift we made a key decision to do away with the C preprocessor, eliminating bugs and making code much easier to understand. This is a big win for developers, but it also means Swift needs to implement some old features in new ways. Most of these features are obvious (importing modules, conditional compilation), but perhaps the most interesting one is how Swift supports macros like assert().

@auto_closure is great, but I still wish for the power of actual macros.

Update (2015-02-11): Part 2 introduces __FILE__ and __LINE__. Swift does not have a way of capturing the expression as a string, among other limitations from not having macros.
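For reference, here’s a minimal sketch of the technique (the myAssert name is mine), using the modern spellings (@autoclosure, #file, #line) rather than the @auto_closure, __FILE__, and __LINE__ spellings from the original posts:

    // A simplified assert() in the style of Apple’s post. The condition and
    // message parameters are @autoclosure, so the call site’s expressions are
    // wrapped in closures and only evaluated if the assertion runs them.
    // (Assumes a DEBUG compilation condition is defined, e.g. via -D DEBUG.)
    func myAssert(
        _ condition: @autoclosure () -> Bool,
        _ message: @autoclosure () -> String = "",
        file: StaticString = #file,
        line: UInt = #line
    ) {
        #if DEBUG
        if !condition() {
            // The real assert() traps; printing keeps this sketch harmless.
            print("Assertion failed: \(message()) (\(file):\(line))")
        }
        #endif
    }

    myAssert(2 + 2 == 4)                // condition closure never called in release builds
    myAssert(false, "this one fires")   // reports the call site’s file and line

Note that, as the update says, there is still no way to capture the text of the failing expression itself; file and line are the best you can do without macros.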

Exploring Swift Memory Layout

Mike Ash:

It looks similar to an Objective-C object, but there's 16 bytes of metadata instead of just 8 as is the case for Objective-C. It turns out that it is an Objective-C object that can be inspected using the APIs in objc/runtime.h.
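Here’s a sketch of what that inspection looks like (Apple platforms only; the Example class is just illustrative, and the exact name and size reported vary by Swift version):

    import ObjectiveC

    class Example {
        var x = 42
    }

    // On Apple platforms, a pure Swift class instance is also a valid
    // Objective-C object, so the plain runtime APIs can inspect it:
    let obj = Example()
    let cls: AnyClass = object_getClass(obj)!
    print(String(cString: class_getName(cls)))  // the runtime’s (mangled) name for the class
    print(class_getInstanceSize(cls))           // header (isa + reference counts) plus stored properties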

Joe:

The other 32-bit value in the reference count is the unowned reference count, which is incremented by unowned references. The object is destroyed when its strong reference count hits zero (which also decrements the unowned reference count), and deallocated when its unowned reference count hits zero. This allows the runtime to verify that unowned references don't dangle, while avoiding the full cost of a nil-ing weak reference.

Still, it seems like a lot of memory to use in each instance for reference counting, when most objects won’t have very high reference counts.
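A small illustration of those semantics (a sketch; the Resource name is mine, and the runtime’s actual diagnostic wording differs):

    class Resource {
        deinit { print("strong count hit zero: destroyed") }
    }

    var strong: Resource? = Resource()
    unowned let borrowed = strong!  // bumps the unowned count, not the strong count

    strong = nil  // deinit runs here; the allocation itself survives until
                  // the unowned count also drops to zero

    // Reading `borrowed` now traps deterministically instead of dangling:
    // print(borrowed)  // fatal error: attempted to read an unowned reference
    //                  // but the object was already deallocated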

Update (2014-08-01): Mike Ash:

Swift memory layout for generics is completely straightforward. The values are laid out just as they are with non-generic types, even if it means that class instances change size. Optionals of reference types represent nil as all zeroes, just like we're used to in Objective-C. Optionals of value types append a byte to the end of the value to indicate whether or not a value is present. Protocol types take up 40 bytes of storage, with the last two pointers containing references to type metadata tables, and the rest available for storing the underlying value. If more than 24 bytes of storage is needed, the value is automatically placed on the heap, and a pointer to the allocation is stored instead.
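Those numbers are easy to spot-check (a sketch with names of my own choosing, using MemoryLayout, which postdates the post’s sizeof-based approach; sizes assume a 64-bit Apple platform):

    class Ref {}
    protocol P {}
    struct FourWords: P { var a = 0, b = 0, c = 0, d = 0 }  // 32 bytes: too big for the buffer

    // Optionals of reference types reuse the all-zeroes bit pattern for nil:
    print(MemoryLayout<Ref>.size, MemoryLayout<Ref?>.size)  // 8 8

    // Optionals of value types append a tag byte:
    print(MemoryLayout<Int>.size, MemoryLayout<Int?>.size)  // 8 9

    // Existentials are a 40-byte container: a 24-byte inline buffer plus two
    // metadata/witness-table pointers. FourWords doesn’t fit in the buffer,
    // so an instance is boxed on the heap behind a pointer instead:
    let boxed: P = FourWords()
    print(MemoryLayout<P>.size)               // 40
    print(MemoryLayout.size(ofValue: boxed))  // 40: the container, not the 32-byte payload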