Tuesday, May 19, 2020

FBI Unlocks Pensacola Phone

Joe Rossignol:

FBI officials have somehow managed to unlock at least one of two passcode-protected iPhones owned by Mohammed Saeed Alshamrani, the perpetrator of a mass shooting at a Naval Air Station in Florida last December, according to CNN.

Apple provided the FBI with iCloud data belonging to Alshamrani, but it refused to assist investigators with gaining access to the iPhones.

Malcolm Owen:

Though the unlock method wasn’t revealed, the fact that the FBI has been able to gain access to evidence would usually be thought to slightly reduce the pressure applied by the US government and law enforcement agencies upon Apple to provide more assistance beyond what is already offered by the iPhone maker. To US Attorney General William Barr, the press conference was an opportunity to try and increase that pressure.

Apple:

Apple responded to the FBI’s first requests for information just hours after the attack on December 6, 2019 and continued to support law enforcement during their investigation. We provided every piece of information available to us, including iCloud backups, account information and transactional data for multiple accounts, and we lent continuous and ongoing technical and investigative support to FBI offices in Jacksonville, Pensacola and New York over the months since.

[…]

It is because we take our responsibility to national security so seriously that we do not believe in the creation of a backdoor — one which will make every device vulnerable to bad actors who threaten our national security and the data security of our customers. There is no such thing as a backdoor just for the good guys, and the American people do not have to choose between weakening encryption and effective investigations.

I’m trying to figure out what the last clause means. It seems like Apple is saying that it’s good that there was a security flaw that the FBI was able to exploit. This seems to let everyone have their cake and eat it, too. We get strong encryption, and the FBI gets the information it wants. But, if Apple ever fixes all the flaws, then there will be a hard choice between weakening encryption for all and impeding investigations. And, in the meantime, the strong encryption carries a huge asterisk because the government seems to be able to get into every high-profile phone, and there are tools for sale that let others do so as well.


Update (2020-05-20): Kevin Collier and Cyrus Farivar:

The FBI was able to eventually access Alshamrani’s phone not by an unprecedented technical feat, but rather by “an automated passcode guesser,” according to a person familiar with the situation who spoke on condition of anonymity because the person was not authorized to speak publicly on the matter.

Via John Gruber:

So you can see why the FBI and DOJ are still pressuring Apple to build backdoors into devices — if the Pensacola shooter had used a decent alphanumeric passphrase, it’s very unlikely they’d have been able to get into his iPhone.

On the other hand, law enforcement benefits greatly from the fact that the default iOS passcode remains only 6 numeric digits.
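The arithmetic behind Gruber’s point is easy to sketch. The comparison below assumes a hypothetical guess rate for an automated passcode guesser; the rate and the passphrase length are illustrative assumptions, not reported details of the Pensacola case, and real iPhones throttle attempts in hardware, which makes even the numeric case far slower in practice.

```python
# Back-of-the-envelope comparison of passcode search spaces.
# GUESSES_PER_SECOND is a hypothetical rate for an automated guesser;
# actual devices enforce escalating delays between attempts.

GUESSES_PER_SECOND = 100  # assumption for illustration only

def worst_case_seconds(alphabet_size: int, length: int) -> float:
    """Seconds to exhaust every combination at the assumed guess rate."""
    return alphabet_size ** length / GUESSES_PER_SECOND

# Default iOS passcode: 6 numeric digits (10 symbols).
six_digit = worst_case_seconds(10, 6)

# A 10-character alphanumeric passphrase (a-z, A-Z, 0-9 = 62 symbols).
passphrase = worst_case_seconds(62, 10)

print(f"6-digit PIN:        {six_digit / 3600:.1f} hours, worst case")
print(f"10-char passphrase: {passphrase / (3600 * 24 * 365):.2e} years, worst case")
```

Even at a generous guess rate, the six-digit space (one million combinations) falls in hours, while a modest alphanumeric passphrase pushes the worst case into hundreds of millions of years — which is the gap the FBI’s “automated passcode guesser” depends on.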


Niall O'Mara

The weird thing is I never hear people say extremely strong physical doors should be done away with - even though criminals take advantage of them to try and keep law enforcement at bay.

Apple's position of not unlocking iPhones is ostensibly good and I agree that they shouldn't, but their public arguments for it don't seem logically sound.

The flaws and exploits used to unlock the (often outdated) phones demonstrably exist, so the "backdoors" are already present and available to "good guys" and "bad guys" alike from other parties. Apple obviously knows about most of these flaws and has the same or greater capability to exploit them. Yet it puts up a front of refusal as though the phones were not already open books to black hats, even as it provides every requested bit of users' iCloud information off its "strong hardware and software protected" data centers.

It's not clear where Apple's lines are drawn, and why. They're fine with giving away everything else about their customers, including backups of the same information as on the phones (so there's no ethical compunction), and their refusal to unlock the devices themselves provides zero further protection given that forensics firms worldwide can inevitably do the same job (so there's no technical reason).

Kevin Schumacher

> I’m trying to figure out what the last clause means.

I read it as saying that investigations can be effective with the information available via iCloud, etc. and without access to the physical device. So less that they're happy about exploits and more that they aren't going to push this standoff any further by doing properly encrypted iCloud backups. As of right now, if the user relies on things like Apple's Contacts app, as well as iCloud backups, there is a *lot* of information Apple can provide.

> The weird thing is I never hear people say extremely strong physical doors should be done away with - even though criminals take advantage of them to try and keep law enforcement at bay.

Arguably, that line of argument is why suitcases come with “TSA-approved locks”. You bring one on your flight, and you give the government a master key to open it (while, in theory, not allowing anyone else to open it).

Of course, those keys have been cracked, because such a backdoor always eventually leaks. (Not to mention a physical key obviously has extremely weak encryption.)

Ghost Quartz

> I’m trying to figure out what the last clause means

I don’t know, I think even with flawless device encryption, the choice between effective investigations and strong encryption is a false dichotomy. Investigators can do their job without breaking into people’s phones.

> They're fine with giving away everything else about their customers, including backups of the same information as on the phones

iCloud backups are neither comprehensive nor the only way of backing up the phone. I personally don’t use them. It deeply bothers me that the default/recommended backup scheme is insecure (not to mention severely lacking in storage) but let’s not pretend that you’re being forced to use iCloud.

> their refusal to unlock the devices themselves provides zero further protection given that forensics firms worldwide can inevitably do the same job

Given this, why would the FBI bother to ask Apple for that access when they can go to a forensic firm that offers that capability? What they’re really agitating for is guaranteed, built-in access to this device, and every device in the future. This is far worse than the current situation, where Apple tries its best to make devices unbreakable and forensics firms try their best to bypass those protections.
