When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
This is really important. Apple cannot decrypt locked phones, so the FBI wants Apple to give them a rewritten iOS that bypasses security.
I have no idea why the FBI wants into the iPhone other than to set a precedent. They already have iCloud data and text messages and computers.
I figure the FBI is just using this case as a way to build public support for backdoors.
Because there is nothing the data on the iPhone can tell the FBI that the FBI doesn’t know already.
What’s important to remember is that Apple gave the FBI access to everything that exists and additional advice.
Apple gave the FBI forensic advice. That fact alone makes this look even more like a backdoor fishing expedition by the FBI.
FBI Supervisory Special Agent Christopher Pluhar stated in a declaration that he was able to obtain from Apple all the data backed up to its iCloud servers from the phone. That data showed that Farook was in communication with individuals who were later killed. Significantly, Pluhar said, the most recent backup took place on Oct. 19, 2015, indicating that Farook may have intentionally disabled the backup feature.
Previously: Secure Golden Key.
If the San Bernardino gunmen had used an iPhone with the Secure Enclave, then there is little to nothing that Apple or the FBI could have done to guess the passcode. However, since the iPhone 5C lacks a Secure Enclave, nearly all of the passcode protections are implemented in software by the iOS operating system and, therefore, replaceable by a firmware update.
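The point about the protections being "implemented in software" is worth making concrete: on a pre-Secure Enclave device, the try counter, the escalating delays, and the wipe-after-10 policy are all ordinary code in iOS, which is why a firmware update signed by Apple could simply omit them. Here is a minimal illustrative model; the delay schedule matches Apple's documented behavior, but the class and method names are my own, not anything from iOS:

```python
import time

# Apple's documented delay schedule for wrong passcode attempts:
# attempts 1-4: none; 5th: 1 min; 6th: 5 min; 7th-8th: 15 min; 9th: 1 hour.
DELAYS = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}

class PasscodeGate:
    """Illustrative model of the software checks the FBI asked Apple to remove."""

    def __init__(self, correct_passcode, wipe_after=10):
        self._correct = correct_passcode
        self._failures = 0
        self.wipe_after = wipe_after
        self.wiped = False

    def try_passcode(self, guess, sleep=time.sleep):
        if self.wiped:
            raise RuntimeError("device wiped")
        if guess == self._correct:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.wipe_after:
            self.wiped = True  # "Erase Data" option: encryption keys destroyed
        else:
            sleep(DELAYS.get(self._failures, 0))  # escalating delay
        return False
```

Because every branch here is plain software, a modified iOS signed by Apple could set the wipe threshold to infinity and the delays to zero, which is exactly what the order requests.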
If you take Tim Cook’s message at face value, the ability to create such an iOS version means there already is a backdoor.
Apple seems to be evading the admission that they already hold privileged keys they can legally be compelled to use.
Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security.
This is just the beginning. I’m sure Apple will get dragged through the mud by officers and politicians alike in the coming weeks, months, years. What will be remembered though is that this thrusts the privacy debate into the spotlight, and history will remember Apple as a leader in maintaining freedom when others would seek to erode it.
Specifically, Apple is not being asked to break the encryption on the iPhone in question (which is impossible — more on this in a moment), but rather to disable the functionality that wipes the memory when multiple wrong passcodes are entered in a row.
It’s important to be clear on the definition of backdoor: Cook is taking a broader one which says that any explicitly created means to circumvent security is a backdoor; the reason the distinction matters is that a more narrow definition that says a backdoor is a way to bypass encryption specifically would not apply here. In other words, Apple is not being asked to create a “secret key” for the data (which would be impossible after the fact) but rather to make it easier to brute force the passcode on the device.
I commend Apple for standing up to this, but unfortunately, I suspect they’re eventually going to lose.
This isn’t even REALLY about crypto. It’s about whether developers can be conscripted to write spyware.
This is an unprecedented, unwise, and unlawful move by the government. The Constitution does not permit the government to force companies to hack into their customers’ devices.
The actual order specifying the not-a-crypto-backdoor the court is requesting from Apple.
Preventing such an escape would be made all the harder by the fact that foreign intelligence agencies and criminal organizations alike would undoubtedly pay immense sums of money for access to such a master key. A security exploit broker called Zerodium already paid $1,000,000 for a browser-based security exploit in iOS 9 that it planned to resell to government and defense customers (see “The Million Dollar iOS Hack (Isn’t),” 3 November 2015). And that’s for something that Apple probably found and blocked in one of the updates to iOS 9. Can you imagine how much an iOS master key would sell for? And how that kind of money could corrupt people within the chain of trust for such a key? Like Tolkien’s One Ring, the power of a master key for millions of phones would be nigh-impossible to resist (unfortunately, Apple’s latest diversity report didn’t break out the number of courageous hobbits working at the company).
That’s why this is about far more than a single phone. Apple does not have the existing capability to assist the FBI. The FBI engineered a case where the perpetrators are already dead, but emotions are charged. And the law cited is under active legal debate within the federal courts.
The crux of the issue is this: should companies be required to build security-circumvention technologies to expose their own customers? Not "assist law enforcement with existing tools," but "build new tools."
Reading through the order, it seems the FBI thinks that a modified version of the operating system would allow them to engage in high-speed attacks, if the 10-tries limit were removed. The request indicates they likely can’t image the device and perform all the attacks on their own super-fast computers, due to that hardware key. With a four-character passcode the device could probably be cracked in hours.
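The arithmetic behind "hours" is straightforward. Apple's iOS security documentation puts the hardware-entangled key derivation at roughly 80 ms per guess, a floor that survives even when the software delays and wipe are removed, because each guess must run on the phone itself. The figures below are back-of-the-envelope estimates under that assumption, not measured numbers:

```python
# Rough worst-case brute-force time once the 10-try limit and artificial
# delays are gone. The ~80 ms floor comes from the per-guess key derivation
# being entangled with the device's hardware key.
PER_GUESS_SECONDS = 0.08

def worst_case_hours(alphabet_size: int, length: int) -> float:
    """Hours needed to exhaust every passcode of the given alphabet and length."""
    return (alphabet_size ** length) * PER_GUESS_SECONDS / 3600

print(f"4-digit PIN:         {worst_case_hours(10, 4):.2f} h")  # ~0.22 h
print(f"6-digit PIN:         {worst_case_hours(10, 6):.2f} h")  # ~22 h
print(f"4-char alphanumeric: {worst_case_hours(36, 4):.2f} h")  # ~37 h
```

A simple four-digit PIN falls in minutes; even a four-character lowercase alphanumeric passcode falls within a day or two, which matches the order's interest in "high-speed attacks."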
Without strong encryption, foreign courts and foreign bureaucracies will have access to information on American citizens living on American soil. Is that what you want, Mr. or Ms. Legislator?
It is my understanding, from background sources, that all devices are vulnerable.
Some may see this confrontation between Apple and the FBI as an industry vs. government dispute, but it’s far more than that. As personal technology and the internet permeate almost every aspect of wider society, the “tech industry” is indistinguishable from society as a whole. The right to defend our personal information, and the rights of companies to act on our behalf in that pursuit, are completely and inextricably tied to our rights as members of society.
Google CEO Sundar Pichai has chimed in on the escalating battle between the FBI and Apple over iPhone encryption. Describing the letter published by Apple’s Tim Cook as “important,” Pichai says that a judge’s order forcing Apple to assist the FBI in gaining access to the data on a terrorist’s iPhone “could be a troubling precedent.” Seeing as Google oversees the Android operating system, Pichai is a crucial voice in this debate; Android also offers encryption to safeguard personal data.
Could Pichai’s response be any more lukewarm? He’s not really taking a stand, and the things he’s posing as questions aren’t actually in question. I’m glad he chimed in at all, and that he seems to be leaning toward Apple’s side, but this could be a lot stronger.
But in addition to whatever risks government access poses, there is a subtle but crucial point that is often overlooked: The kinds of security architectures in which it is easy to insert a back door are typically less secure than the security architectures in which it is hard to insert a back door.
The point is that the FBI is asking Apple to crack its own safe: it doesn’t matter how good the locks are if you modify them to be weak after installing them. And once the precedent is set, the opportunity is there for similar requests to be made of all billion or so active iOS devices. Hence the importance of this fight for Apple.
This is the most important tech case in a decade.
I am convinced that Apple is doing the morally correct thing here, by fighting the court order. I’ll bet most of you reading this agree. But like Thompson, I’m not sure at all Apple is doing the right thing politically.
I have always admired Tim Cook for his stance on privacy and Apple’s efforts to protect user data and couldn’t agree more with everything said in their Customer Letter today.
Update (2016-02-18): Will Strafach:
On a technical level, Apple could carry out the order by creating a RAM disk signed by the company’s production certificate for the specific ECID of the suspect’s iPhone. This solution would allow Apple to use existing technologies in the firmware file format to grant access to the phone while ensuring that there is no possible way the same solution would work on another device.
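One way to picture why an ECID-locked image stays single-device is that the signature covers both the firmware and the device's unique chip ID, so the boot chain's verification fails on any other unit. The toy sketch below uses an HMAC as a stand-in for Apple's actual asymmetric signing and personalization scheme; every name in it is illustrative, not Apple's:

```python
import hashlib
import hmac

SIGNING_KEY = b"stand-in for Apple's private signing key"  # illustrative only

def sign_firmware(firmware: bytes, ecid: int) -> bytes:
    """Produce a signature bound to one device's ECID (unique chip ID)."""
    payload = hashlib.sha256(firmware).digest() + ecid.to_bytes(8, "big")
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()

def device_will_boot(firmware: bytes, signature: bytes, device_ecid: int) -> bool:
    """Boot-chain check: the device verifies against ITS OWN ECID, so a
    signature personalized for another device never validates here."""
    payload = hashlib.sha256(firmware).digest() + device_ecid.to_bytes(8, "big")
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(signature, expected)

ramdisk = b"custom unlock ramdisk image"
sig = sign_firmware(ramdisk, ecid=0x1122334455667788)       # suspect's phone
print(device_will_boot(ramdisk, sig, 0x1122334455667788))   # True
print(device_will_boot(ramdisk, sig, 0xDEADBEEF00000000))   # False
```

The technical containment is real, which is why the argument against compliance rests on precedent rather than on the tool escaping in this one instance.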
The aspect that would actually affect the public is the fact that by doing this, Apple will show that breaking into an iPhone is “possible,” and allow the FBI to use this case in the future as leverage.
This is only possible on a technical level in very specific circumstances, but if Apple assists in this instance, then it paves the way for more unreasonable and technically difficult requests to be made. In those scenarios, it will be on Apple to try to explain why it cannot accommodate the new requests.
To allow restoration to a custom firmware, Apple would need to either: (a) make changes to the way its restore server works for this specific case, potentially causing major security concerns if any sort of mistake is made (which could make this an unreasonable / burdensome request), or (b) bring the device onto its internal network and load the firmware using the restore server used internally, since it can be assumed that such an in-house server exists for the purpose of restoring to unreleased firmware versions.
Lost in the noise today is this terrifying detail: Apple can update the Secure Enclave without wiping the data on it.
Here’s why the totality of what we know right now leans in favor of Cook and his slippery slope argument.
The issue is not Apple’s. It is not even the FBI’s. The issue is that, as often happens, technology speeds past our ability to adapt or create new laws that match the onslaught of daily technological change. Typically, I am for fewer laws rather than more, but I’m also pragmatic. We should be asking our lawmakers to enact a law that fits the needs of this situation and situations like it, so that rather than being on an eternally slippery slope of privacy violations hidden behind the All Writs Act, we have a law that will truly limit the circumstances where companies like Apple can be compelled to help a government agency crack a device.
Rogers’ claims about Paris contradict the information that came out of France following the attacks. There were claims by former US intelligence officials that encrypted communications had been used by the Islamic State affiliated terrorists in the immediate wake of the attacks. But those claims were largely dismissed by French authorities when they looked at the actual communications on devices recovered from the group. According to statements from French law enforcement, the attackers had used standard SMS messages to communicate—not encrypted messaging apps on smartphones.
The problem is that you’re talking about a multinational organisation attempting to run a global business while needing to get picky about who can exploit weaknesses deliberately designed into the product. It’s a very slippery slope once there is a backdoor. Without it, life is much simpler – nobody gets in – and clearly this is where Apple wants to be.
The order is pretty specific technically. This implies to me that what the FBI is asking for is technically possible, and even that Apple assisted in the wording so that the case could be about the legal issues and not the technical ones.
Today I walked by a television showing CNN. The sound was off, but I saw an aerial scene which I presume was from San Bernardino, and the words “Apple privacy vs. national security.” If that’s the framing, we lose. I would have preferred to see “National security vs. FBI access.”
Apple sent trusted engineers to try that method, the executives said, but they were unable to do it. It was then that they discovered that the Apple ID password associated with the iPhone had been changed. […] Had that password not been changed, the executives said, the government would not need to demand the company create a “backdoor” to access the iPhone[…]
Update (2016-02-20): John Gruber:
Daniel Roberts has posted a screenshot of the entire segment on China that was cut from the article.
This seems like a rather important discussion topic. It’s perplexing why the Times would choose to remove it; doubly so considering it’s a silent edit. This article apparently appeared on page A1 of the print edition today, too.
A county representative later told Reuters that FBI agents requested the iCloud password reset.
Specifically, the FBI asked someone to destroy potential evidence (the iCloud auth token) that could have been used without a new backdoor.
So, technically speaking, I think what happens next is that Apple begins to engineer phones such that they can no longer assist the FBI, even if compelled by court order. Here’s my specific bet: right now Apple can update a phone’s entire software stack to reconfigure a particular phone’s behavior, including number of PIN tries and delays – the most secure parts of the phone. I bet Apple will move towards making the most sensitive parts of that stack updatable only in very specific conditions[…]
Crucial details in the @FBI v. #Apple case are being obscured by officials. Skepticism here is fair.
What many haven’t considered is the significant difference – in the legal world – between providing lab services and developing what the courts will consider an instrument.
If evidence from a device ever leads to a case in a courtroom, the defense attorney will (and should) request a copy of the tool to have independent third-party verification performed, at which point the software will need to be made to work on another set of test devices. Apple will need to work with defense experts to instruct them on how to use the tool to provide predictable and consistent results.
Update (2016-02-22): John Gruber:
The only possible explanations for this are incompetence or dishonesty on the part of the FBI. Incompetence, if they didn’t realize that resetting the Apple ID password could prevent the iPhone from backing up to iCloud. Dishonesty, if they directed the county to do this knowing the repercussions, with the goal of setting up this fight to force Apple to create a back door for them in iOS. I’m not sure which to believe at this point. I’d like to know exactly when this directive to reset the Apple ID password was given — “in the hours after the attack” leaves a lot of wiggle room.
When the consumer version of the Internet of Things (IoT) is broadly deployed on smart connected objects everywhere, we’ll see an astounding flow of private data. Naturally, conscientious suppliers will protect these most intimate details of our daily lives with encryption…but government busybodies will want to know if we’re following the Law as we fish, swim, eat, smoke, multiply, and die.
The San Bernardino litigation isn’t about trying to set a precedent or send any kind of message. It is about the victims and justice. Fourteen people were slaughtered and many more had their lives and bodies ruined. We owe them a thorough and professional investigation under law.
Apple has posted a FAQ:
Yes, it is certainly possible to create an entirely new operating system to undermine our security features as the government wants. But it’s something we believe is too dangerous to do. The only way to guarantee that such a powerful tool isn’t abused and doesn’t fall into the wrong hands is to never create it.
I agree, but I wish Apple wouldn’t exaggerate with language like “entirely new operating system.”
The FBI has made statements both in their court filings and in the press which are simply untrue. If it weren’t for the fact that the people making these claims are actual forensics experts (or work with such experts), I’d be inclined to say that they just don’t know what they’re talking about. Given that they do work for the FBI, I think it’s reasonable to hold them to a much higher standard of clueful-ness.
Apple is dangerously muddling the encryption debate by claiming a backdoor doesn’t exist until a proof-of-concept exploit is written.
It confuses me how much uptake this has had. Lots of people repeating it. The backdoor is there now….
Update (2016-02-23): Ben Thompson:
The first thing to understand about the issue at hand is that there are three separate debates going on: the issue at hand, the encryption debate, and the PR battle. To understand the issue it is necessary to separate them, but to figure out which side may win it is equally critical to understand how they relate to each other.
The U.S. Department of Justice is pursuing additional court orders that would force Apple to help federal investigators extract data from twelve other encrypted iPhones that may contain crime-related evidence, according to The Wall Street Journal.
Update (2016-02-28): Ted Goranson:
The FBI’s request signals a new attempt at control. This is an organization that usually delivers its requests under seal, meaning they remain secret. In this case, the FBI took the unusual step of making its demand public, so as to push the issue into the press by hitting the hot button of “terrorism.” The FBI’s intent, it seems, is to prompt lawmakers to respond to public outrage.
Most legal arguments against “unlocking” this particular phone cite the constitutionally protected right to free speech. But a better parallel is the right to gun ownership in the US and the constitutional provisions that support it. In the past, technology in the US was primarily developed in a military context and justified in the service of war, but also to maintain civil order. At the end of the eighteenth century, the apotheosis of that technology was the firearm.
Today’s most powerful technologies relate to information concerning our thoughts, associations, and bodies. If the framers of the US Constitution were alive today and as well educated as they were then, the Bill of Rights would likely be focused on balancing access to information to ensure that the government remains within bounds.
I can’t really believe that the FBI would abuse the White House’s trust by launching an unauthorized expedition to force Apple to backdoor the iPhone. To the contrary, I think this operation was vetted at the highest level – but not for its apparent purpose.
Just this morning, the NY Times tells us that a meeting last month between White House staff and tech executives ended on a sour note. (Such stories are a part of a long tradition of “authorized” disclosures, articles based on trusted relationships between insider sources and writers.) Apparently, Chief of Staff Denis R. McDonough took exception to Cook’s reproaching the White House for “lacking leadership” on the encryption issue, and reportedly called Cook’s statements a “rant”.
Update (2016-03-11): Cyrus Farivar:
As expected, federal prosecutors filed their formal response on Thursday in the ongoing case involving the seized iPhone 5C that was used by one of the shooters in the San Bernardino terrorist attack in December 2015.
Apple was surprised last month when the DOJ decided to fight this in public. But until today, the tone has been civil. Adversarial, clearly — there is no middle ground. But civil. Today, though, the DOJ made things nasty. I think Apple was genuinely surprised by the threatening tone and nature of today’s brief.
“The evidence on Farook’s iCloud account suggests that he had already changed his iCloud password himself on October 22, 2015—shortly after the last backup—and that the autobackup feature was disabled. A forced backup of Farook’s iPhone was never going to be successful, and the decision to obtain whatever iCloud evidence was immediately available via the password change was the reasoned decision of experienced FBI agents investigating a deadly terrorist conspiracy,” the government claims.
Update (2016-03-23): John Gruber:
I get the feeling the FBI concluded they were going to lose, so they’re not even going to test it.
Israeli mobile software developer Cellebrite is helping the FBI in its attempt to unlock the iPhone at the center of the San Bernardino shooter investigation.
On Monday, the U.S. Justice Department convinced the court overseeing its ongoing battle with Apple to postpone a hearing scheduled to take place March 22. The DoJ said new leads had been discovered that could provide it with a way to unlock the iPhone 5c used by San Bernardino shooter Syed Farook without involving Apple.
That was nearly twelve years ago. The person in charge of sales at that time was Tim Cook. I am sure that, given what I have seen since then, I would never keep any personal files on a company computer.
So please pardon me if I am a little skeptical of Apple’s interest in preserving the privacy of anyone. The most sincerely held belief at Apple is making money from selling products. That is fine; I have nothing against Apple making money, because they make some very good products and I use one daily. However, let us not put Apple on a security pedestal without asking Tim if he would try to break into an iPhone if Apple thought someone there was saying something the company didn’t want said.
Update (2016-03-25): Bruce Schneier:
My guess is that it’s not [Cellebrite]. They have an existing and ongoing relationship with the FBI. If they could crack the phone, they would have done it months ago.
Cellebrite, the mobile forensics company reportedly assisting the FBI to extract data from the iPhone in the San Bernardino case, has written a white paper noting that extracting the data is only part of the challenge. If law enforcement agencies are to be able to obtain convictions on the basis of that data, there are a lot of questions that have to be answered.
Update (2016-03-30): David Sparks:
The best-case scenario at the legislative end would be for a law to be passed restricting access and prohibiting the government from requiring backdoors in cellular phones. Let’s just say I’m not holding my breath for that one. In my opinion, if there is going to be a law passed, it’s going to be a law requiring installation of a backdoor, not the opposite.
I believe our President understands all of this, that he believes unbreakable cryptography is the lesser of two bad choices…but he must weigh what he says. Can we really expect him to say that the FBI is wrong? Instead, he lets the FBI push hard, absorbs some of the reflected Law and Order sunshine, and allows the San Bernardino case to take the long, arduous road to the Supreme Court. And Backdoor legislation will be introduced, discussed and discussed, with the Tech Industry up in arms – and dollars – against it.
By then, Barack Obama will be a former President, Free At Last to say what he really thinks. I can’t wait.
The New York Times is reporting that the outside party engaged to unlock the San Bernardino terrorist’s iPhone has been successful, and the Department of Justice is withdrawing from its legal action against Apple.
Apple has issued a statement concerning the Department of Justice withdrawing its demand under the All Writs Act that the company aid in creating a version of iOS that would be faster and easier for the government to hack into.
I guarantee you at some point, somebody told Director Comey, “good god, Jim, you’ve got to make this go away.”
A battle is over, but the war has only just begun.
Update (2016-04-02): Jonathan Zdziarski:
FBI: You should do it, it’s just one phone
Apple: No it isn’t
FBI: We got in
Apple: You should say how, it’s just one phone
FBI: No it isn’t
The FBI cracked a San Bernardino terrorist’s phone with the help of professional hackers who discovered and brought to the bureau at least one previously unknown software flaw, according to people familiar with the matter.
Although the method of the FBI’s entry into the San Bernardino shooter’s iPhone has been the source of many rumors, a new report from CBS News states that at this point in the process, “nothing of real significance” has been discovered within the device.