Monday, March 9, 2020

Trying to Weaken Encryption Again

The New York Times:

Attorney General William Barr and his British and Australian counterparts are set to push Facebook for a back door to its end-to-end encryption on WhatsApp and other messaging platforms, which would give investigators access to now-secret communications.

Matthew Green:

If there’s any surprise in the Barr letter, it’s not the government’s opposition to encryption. Rather, it’s the new reasoning that Barr provides to justify these concerns. In past episodes, law enforcement has called for the deployment of “exceptional access” mechanisms that would allow law enforcement access to plaintext data. As that term implies, such systems are designed to treat data access as the exception rather than the rule. They would need to be used only in rare circumstances, such as when a judge issued a warrant.

The Barr letter appears to call for something much more aggressive.

Rather than focusing on the need for exceptional access to plaintext, Barr focuses instead on the need for routine, automated scanning systems that can detect child sexual abuse imagery (or CSAI).
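
Concretely, “routine, automated scanning” means checking every attachment against a database of known-image fingerprints. The sketch below (Python, with made-up names and a placeholder hash) uses an exact cryptographic hash as a stand-in; real systems use perceptual hashes such as PhotoDNA so that re-encoded or slightly altered copies still match. The structural point is the same either way: the scanner has to see every image, which is exactly what end-to-end encryption denies the server.

```python
import hashlib

# Hypothetical database of fingerprints of known abusive images.
# The value below is a placeholder, not a real fingerprint.
KNOWN_BAD_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(image_bytes: bytes) -> str:
    """Fingerprint an attachment (a cryptographic-hash stand-in for a perceptual hash)."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_attachment(image_bytes: bytes) -> bool:
    """Return True if the attachment matches a known-bad fingerprint.

    Routine scanning means this runs on every attachment, not only those
    named in a warrant, which is why it requires access to plaintext and
    cannot run on a server that only ever sees end-to-end encrypted data.
    """
    return fingerprint(image_bytes) in KNOWN_BAD_HASHES
```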

John Gruber:

They don’t use the word “backdoor” but that’s what they’re asking for. End-to-end encryption doesn’t allow for backdoors. So what they’re really asking is for Facebook not to use end-to-end encryption. And the only truly secure, truly private encryption for personal communication is end-to-end encryption. So, when you boil it all down and ignore the emotional pleas that would have you believe this is all about protecting children, what they’re really asking is for Facebook not to safeguard the security and privacy of the messaging of billions of people around the world.
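
To see why end-to-end encryption leaves nothing for a backdoor to open, here is a minimal sketch using the PyNaCl library (the library is real; the scenario and messages are purely illustrative). Only the two endpoints hold the private keys, so a relaying service never possesses anything it could be compelled to disclose.

```python
# pip install pynacl  (a minimal end-to-end encryption sketch)
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# The server only ever sees `ciphertext`. Without one of the two private
# keys there is nothing to decrypt with; there is no third key to hand over.
receiving_box = Box(bob_private, alice_private.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```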

Bruce Schneier:

In an extraordinary essay, the former FBI general counsel Jim Baker makes the case for strong encryption over government-mandated backdoors[…] Basically, he argues that the security value of strong encryption greatly outweighs the security value of encryption that can be bypassed.

Pete Williams (MacRumors):

The FBI is asking Apple Inc. to help unlock two iPhones that investigators think were owned by Mohammed Saeed Alshamrani, the man believed to have carried out the shooting attack that killed three people last month at Naval Air Station Pensacola, Florida.

Nick Heer:

As with the San Bernardino case, Apple says that it is cooperating with authorities. But, unlike that case, the FBI hasn’t yet tried to legally compel Apple into, for example, creating a special version of iOS that has no restrictions on passcode attempts. As with that case, it would set a troubling precedent that encryption should be weakened. So far, there is simply no practical or realistic way of doing so without breaking every user’s security.
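
The “restrictions on passcode attempts” are what make a short numeric passcode meaningful at all. A back-of-the-envelope calculation, assuming the roughly 80 milliseconds per attempt that Apple’s security documentation cites for passcode key derivation, and assuming a compelled iOS build would strip out the escalating delays and the optional erase-after-ten-failures setting, shows what the FBI would gain:

```python
# Rough arithmetic: how long does brute-forcing a passcode take once
# attempt limits and escalating delays are removed?
# Assumes ~80 ms per attempt (the key-derivation cost Apple has cited);
# the rest is an illustrative assumption, not Apple's numbers.
SECONDS_PER_ATTEMPT = 0.08

for digits in (4, 6):
    combinations = 10 ** digits
    worst_case_seconds = combinations * SECONDS_PER_ATTEMPT
    print(f"{digits}-digit passcode: {combinations:,} combinations, "
          f"worst case ≈ {worst_case_seconds / 3600:.1f} hours")

# 4-digit: 10,000 combinations, roughly 13 minutes in the worst case.
# 6-digit: 1,000,000 combinations, roughly 22 hours.
# With the normal limits in place (escalating delays, optional erase after
# ten failures), the same exhaustive search is effectively impossible.
```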

John Gruber:

Honestly, I don’t think this has anything to do with the Pensacola shooter. I think this is part of a campaign to drum up public support for making true encryption illegal. And if it really is about the Pensacola shooter, the FBI’s leadership doesn’t understand how encryption works, which is disgraceful.

Nick Heer:

Twice now, the U.S. Department of Justice has pushed Apple to help decrypt iPhones involved in high-profile crimes. Twice, Apple has pushed back. And, twice, the popular press has framed these cases in terms that do not help their general-audience readers understand why Apple is refusing demands to cooperate; instead, using language that implicitly helps those who believe that our rights should be compromised to a lowest common denominator.

Juli Clover:

United States President Donald Trump this afternoon weighed in on a disagreement between Apple and the FBI, calling on Apple to “step up to the plate” and “help our great country” by unlocking the iPhones used by Florida shooter Mohammed Saeed Alshamrani.

Trump said that the U.S. is “helping Apple all of the time” but Apple refuses to “unlock” smartphones used by “killers, drug dealers and other violent criminal elements.”

Rene Ritchie:

So, what’s critical is to step back and really look at what’s being asked for here. No more secrets. The ability to get into not just a single criminal’s phone, but everybody’s phone. Yours and mine. And the ability for not just the FBI to get into it, but everybody. Foreign agencies and criminals.

John Gruber:

The big question remains unclear in all this coverage: did Apple refuse the DOJ’s request, or are they unable — technically — to fulfill the request? The DOJ continues to talk as though this is something Apple could do but refuses to.

Nick Heer:

To be clear, my iPhone still prompted for its passcode when the update had finished its installation process. This did not magically unlock my iPhone. It also doesn’t prove that passcode preferences could be changed without first entering the existing valid passcode.

But it did prove the existence of one channel where an iPhone could be forced to update to a compromised version of iOS. One that would be catastrophic in its implications for iPhones today, into the future, and for encrypted data in its entirety. It is possible; it is terrible.

David Sparks:

Apple sells into a lot of countries. Any one of them could require they install a back door as a condition of access to the market. Apple’s principles are on a collision course with a massive loss of income. Is it just a question of time before governmental regulation and market pressures make this period of time, where all citizens have relatively secured data and communications, only a temporary phase? I sure hope not.

Nick Heer:

Sparks is right: there will come a time that Apple will need to choose whether it will stand behind strong privacy and security, or if the monetary cost of doing so is simply too high.

Tim Hardwick (9to5Mac, Hacker News):

New questions have been raised about the FBI’s latest request that Apple break its iPhone encryption, after Forbes uncovered a search warrant strongly indicating that federal agents already have tools that can access data on Apple’s latest iPhone models.

William Gallagher:

Republican Senator Lindsey Graham is behind a draft bipartisan bill called the ‘Eliminating Abusive and Rampant Neglect of Interactive Technologies Act of 2019’ or EARN IT. Its stated aims are to “develop recommended best practices… regarding the prevention of online child exploitation.” However, the methods Graham proposes would effectively ban all end-to-end encryption.

Joe Rossignol:

FBI officials have still not managed to unlock a passcode-protected iPhone that investigators believe was owned by Mohammed Saeed Alshamrani, the perpetrator of a mass shooting at a Naval Air Station in Florida in December.

The disclosure was made by FBI director Christopher Wray at a House Judiciary Committee hearing today, according to Bloomberg. Wray told Rep. Matt Gaetz (R-FL) that the FBI is “currently engaged with Apple hoping to see if we can get better help from them so we can get access to that phone,” the report claims.

Tim Hardwick:

The director general of Britain’s Security Service is arguing for “exceptional access” to encrypted messages, in the ongoing battle between authorities and technology companies, reports The Guardian.

Matthew Green:

Yesterday a bipartisan group of U.S. Senators introduced a new bill called the EARN IT act. On its face, the bill seems like a bit of inside baseball having to do with legal liability for information service providers. In reality, it represents a sophisticated and direct governmental attack on the right of Americans to communicate privately.

I can’t stress how dangerous this bill is, though others have tried. In this post I’m going to try to do my best to explain why it scares me.


Update (2020-03-27): Bruce Schneier:

Prepare for another attack on encryption in the U.S. The EARN-IT Act purports to be about protecting children from predation, but it's really about forcing the tech companies to break their encryption schemes[…]

1 Comment

If the legislation is approved, only providers based in EU countries are safe, as they are bound by the GDPR. US-based, and probably Five Eyes–based, providers will hand over their encryption keys to US authorities. Curiosity and questions surround Telegram, as it is run by some Russian guys based in the Seychelles, with servers in Dubai and maybe in Saudi Arabia, and with ties to ???
