Archive for February 7, 2025

Friday, February 7, 2025

UK Orders Apple to Break iCloud Advanced Data Protection

Dominic Preston (Hacker News, MacRumors):

Apple has reportedly been ordered by the UK government to create a backdoor that would give security officials access to users’ encrypted iCloud backups. If implemented, British security services would have access to the backups of any user worldwide, not just Brits, and Apple would not be permitted to alert users that their encryption was compromised.

The Washington Post reports that the secret order, issued last month, is based on rights given under the UK’s Investigatory Powers Act of 2016, also known as the Snoopers’ Charter. Officials have apparently demanded blanket access to end-to-end encrypted files uploaded by any user worldwide, rather than access to a specific account.

[…]

The UK has reportedly served Apple a document called a technical capability notice. It’s a criminal offense to even reveal that the government has made a demand. Similarly, if Apple did accede to the UK’s demands then it apparently would not be allowed to warn users that its encrypted service is no longer fully secure.

Dan Moren:

While law enforcement has long been able to access encrypted data for which Apple holds the keys, this move would reportedly apply to end-to-end data in which the user holds the keys, such as Apple’s Advanced Data Protection. This law would target end-to-end encrypted data from Google and Meta as well.

This is red alert, five-alarm-fire kind of stuff. Providing a backdoor would be worrying enough for reasons that should be obvious to anybody who has the barest inkling about technology—to wit, that there exists no mechanism to keep such a tool out of the hands of malicious actors—but the fact that it would apply beyond the UK borders to other countries is a staggering breach of sovereignty. And, moreover, as Menn points out, such a move would no doubt embolden other powers to ask for access to the same capabilities—such as China.

[…]

Ironically, the biggest impediment might come in the form of the European Union, as Apple apparently argued that the implementation would undermine the European right to privacy.

Nick Heer:

In any case, the reported demands by the U.K. government are an extraordinary abuse in their own right. They have global implications for both U.K. access and, I would venture, access by its allies. As a reminder, U.S. and U.K. spy agencies routinely shared collected data while avoiding domestic legal protections. This order explicitly revives the bad old days of constant access.

Tim Hardwick:

According to sources that spoke to the publication, Apple is likely to stop offering encrypted storage in the UK as a result of the demand. Specifically, Apple could withdraw Advanced Data Protection, an opt-in feature that provides end-to-end encryption (E2EE) for iCloud backups, such as Photos, Notes, Voice Memos, Messages backups, and device backups.

In this scenario, UK users would still have access to basic iCloud services, but their data would lack the additional layer of security that prevents even Apple from accessing it.

Previously:

Update (2025-02-10): Mike Masnick:

While officials repeatedly insisted they weren’t trying to break encryption entirely, those of us following closely saw this coming. Apple even warned it might have to exit the UK market if pushed too far.

[…]

The UK government is demanding that Apple fundamentally compromise the security architecture of its products for every user worldwide. This isn’t just about giving British authorities access to British users’ data — it’s about creating a master key that would unlock everyone’s encrypted data, everywhere.

This is literally breaking the fundamental tool that protects our privacy and security. Backdoored encryption is not encryption at all.

[…]

This global reach is particularly concerning given the UK’s membership in the Five Eyes intelligence alliance. Any backdoor created for British authorities would inevitably become a tool for intelligence and law enforcement agencies across the US, Australia, Canada, and New Zealand — effectively creating a global surveillance capability without any democratic debate or oversight in those countries.

Bruce Schneier:

Apple is likely to turn the feature off for UK users rather than break it for everyone worldwide. Of course, UK users will be able to spoof their location. But this might not be enough. According to the law, Apple would not be able to offer the feature to anyone who is in the UK at any point: for example, a visitor from the US.

And what happens next? Australia has a law enabling it to ask for the same thing. Will it? Will even more countries follow?

This is madness.

Mark Nottingham (via Hacker News):

The UK is presumably interested in Apple providing this functionality because iCloud’s design makes a massive amount of data convenient to access in one location: Apple’s servers. If that data is instead spread across servers operated by many different parties, it becomes less available.

In effect, this is the “decentralize iCloud” option. Apple would open up its implementation of iCloud so that third-party and self-hosted providers could be used for the same functions. They would need to create interfaces to allow switching, publish some specifications and maybe some test suites, and make sure that there weren’t any intellectual property impediments to implementation.

[…]

This isn’t a perfect option. Orders could still force weakened encryption, but now they’d have to target many different parties (depending on the details of implementation and deployment), and they’d have to get access to the stored data. If you choose a provider in another jurisdiction, that makes doing so more difficult, depending on what legal arrangements are in place between those jurisdictions; if you self-host, they’ll need to get physical access to your disks.

Update (2025-03-14): Nick Heer:

If Google had not received a technical capability notice, it would be able to simply say “no”. Because it says it cannot say anything “if it had”, it seems likely it has also been issued a similar demand for access to user data in a decrypted form.

Update (2025-03-18): Tim Hardwick:

Two human rights groups have filed a legal complaint with the UK’s Investigatory Powers Tribunal (IPT) in an attempt to quash the UK government’s demand for Apple to allow backdoor access to its encrypted data (via Financial Times).

Zack Whittaker (via John Gruber):

A group of bipartisan U.S. lawmakers are urging the head of the U.K.’s surveillance court to hold an open hearing into Apple’s anticipated challenge of an alleged secret U.K. government legal demand.

Update (2025-04-08): Tim Hardwick (Hacker News):

Apple has filed a legal appeal against a UK government order requiring the company to create a “back door” to its encrypted cloud storage systems, the Investigatory Powers Tribunal (IPT) confirmed on Monday (via Reuters). The confirmation means that the Home Office cannot keep all the details of its demand out of the public domain.

[…]

According to the IPT ruling, the British government had sought to keep details of the case private. The Home Office argued that publicizing the existence of the appeal could damage national security, but Judges Rabinder Singh and Jeremy Johnson rejected this claim.

Nick Heer:

The public copy of the ruling (PDF) is kind of funny to read because of the secrecy rules around the technical capability notice. Presumably, the judges would know whether Apple had been issued this demand, but cannot say so. They therefore refer extensively to media reporting about the demand, as in their response to U.S. lawmakers’ request to “discuss an alleged technical capability notice” (emphasis my own).

SpamSieve 3.1.1

SpamSieve 3.1.1 improves the filtering accuracy of my Mac e-mail spam filter, amongst other enhancements and fixes.

The update was held up because the Developer ID Notary Service was down for most of the business day yesterday.

Some interesting issues were:

Previously:

DeepSeek Privacy Issues

Dan Goodin:

On Thursday, mobile security company NowSecure reported that the app sends sensitive data over unencrypted channels, making the data readable to anyone who can monitor the traffic. More sophisticated attackers could also tamper with the data while it’s in transit. Apple strongly encourages iPhone and iPad developers to enforce encryption of data sent over the wire using ATS (App Transport Security). For unknown reasons, that protection is globally disabled in the app, NowSecure said.
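For context on what “globally disabled” means here: ATS is on by default for iOS apps, and an app opts out through an `NSAppTransportSecurity` dictionary in its Info.plist. A minimal illustrative sketch of that kind of global opt-out, using Apple’s documented keys (this is not DeepSeek’s actual plist, just the shape of the setting NowSecure describes):

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <!-- Turns App Transport Security off for every connection,
         allowing the app to send data over plain, unencrypted HTTP. -->
    <key>NSAllowsArbitraryLoads</key>
    <true/>
</dict>
```

Apple’s guidance is to leave ATS on and, if a specific endpoint genuinely cannot support TLS, carve out a narrow per-domain exception rather than disabling the protection globally.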

[…]

What’s more, the data is sent to servers that are controlled by ByteDance, the Chinese company that owns TikTok. While some of that data is properly encrypted using transport layer security, once it’s decrypted on the ByteDance-controlled servers, it can be cross-referenced with user data collected elsewhere to identify specific users and potentially track queries and other usage.

Ben Lovejoy:

The latest findings are far worse than the previous security failure which exposed chat history and other sensitive information in a database requiring no authentication …

Brian Krebs:

Beyond security concerns tied to the DeepSeek iOS app, there are indications the Chinese AI company may be playing fast and loose with the data that it collects from and about users. On January 29, researchers at Wiz said they discovered a publicly accessible database linked to DeepSeek that exposed “a significant volume of chat history, backend data and sensitive information, including log streams, API secrets, and operational details.”

“More critically, the exposure allowed for full database control and potential privilege escalation within the DeepSeek environment, without any authentication or defense mechanism to the outside world,” Wiz wrote.

William Gallagher:

NowSecure says it is continuing to research DeepSeek. It notes that the Android version is even less secure than the iOS one.

Previously:

Screenshot-Reading Malware

Wes Davis:

Apps distributed through both Apple and Google’s app stores are hiding malicious screenshot-reading code that’s being used to steal cryptocurrency, the cybersecurity software firm Kaspersky reported today. It’s the “first known case” of apps infected with malware that uses OCR tech to extract text from images making it into Apple’s App Store, according to a blog post detailing the company’s findings.

Kaspersky says it discovered the code from this particular malware campaign, which it calls “SparkCat,” in late 2024 and that the frameworks for it appear to have been created in March of the same year.

Via Guy English:

This is the kind of thing that makes tech so annoying these days. What’s a platform to do? At the scale of adoption of these devices (both Apple and Android) there are countless people who’d not think twice about agreeing to photo access without thinking for a moment of the screenshot with their credentials they saved off a long time ago. The only solution I can think of is only using system UI to pick what apps see. Which we have now. But that’s kind of annoying too.
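The “system UI” English mentions is the out-of-process photo picker: since iOS 14, `PHPickerViewController` runs in a separate process, so the app receives only the specific images the user selects and never needs photo-library permission at all. A minimal sketch, assuming a UIKit view controller (class and method names here are illustrative):

```swift
import PhotosUI
import UIKit

class PickerDemoViewController: UIViewController, PHPickerViewControllerDelegate {
    // Present the system picker. It runs out of process, so no
    // photo-library permission prompt is shown and the app cannot
    // scan the rest of the library (e.g. old credential screenshots).
    func presentPicker() {
        var config = PHPickerConfiguration()
        config.filter = .images
        config.selectionLimit = 1
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = self
        present(picker, animated: true)
    }

    func picker(_ picker: PHPickerViewController,
                didFinishPicking results: [PHPickerResult]) {
        picker.dismiss(animated: true)
        // Only the user's explicit selections are delivered here.
        for result in results
        where result.itemProvider.canLoadObject(ofClass: UIImage.self) {
            result.itemProvider.loadObject(ofClass: UIImage.self) { image, _ in
                // Use the returned UIImage as needed.
            }
        }
    }
}
```

The trade-off English describes is real: because the picker is system-drawn, apps that want ambient access to the whole library (photo backup, gallery apps) find it clunky, which is why the broader permission still exists.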

Bruce Schneier:

That’s a tactic I have not heard of before.

Juli Clover:

Kaspersky located several App Store apps with OCR spyware, including ComeCome, WeTink, and AnyGPT, but it is not clear if the infection was a “deliberate action by the developers” or the “result of a supply chain attack.”

[…]

Apple checks over every app in the App Store, and a malicious app represents a failure of Apple’s app review process. In this case, there does not appear to be an obvious indication of a trojan in the app, and the permissions that it requests appear to be needed for core functionality.

Juli Clover:

Apple pulled the apps from the App Store.