Apple (via Ivan Krstić, ArsTechnica):
Apple Intelligence is the personal intelligence system that brings powerful generative models to iPhone, iPad, and Mac. For advanced features that need to reason over complex data with larger foundation models, we created Private Cloud Compute (PCC), a groundbreaking cloud intelligence system designed specifically for private AI processing. For the first time ever, Private Cloud Compute extends the industry-leading security and privacy of Apple devices into the cloud, making sure that personal user data sent to PCC isn’t accessible to anyone other than the user — not even to Apple. Built with custom Apple silicon and a hardened operating system designed for privacy, we believe PCC is the most advanced security architecture ever deployed for cloud AI compute at scale.
[…]
The root of trust for Private Cloud Compute is our compute node: custom-built server hardware that brings the power and security of Apple silicon to the data center, with the same hardware security technologies used in iPhone, including the Secure Enclave and Secure Boot. We paired this hardware with a new operating system: a hardened subset of the foundations of iOS and macOS tailored to support Large Language Model (LLM) inference workloads while presenting an extremely narrow attack surface. This allows us to take advantage of iOS security technologies such as Code Signing and sandboxing.
On top of this foundation, we built a custom set of cloud extensions with privacy in mind. We excluded components that are traditionally critical to data center administration, such as remote shells and system introspection and observability tools. We replaced those general-purpose software components with components that are purpose-built to deterministically provide only a small, restricted set of operational metrics to SRE staff. And finally, we used Swift on Server to build a new Machine Learning stack specifically for hosting our cloud-based foundation model.
[…]
Since Private Cloud Compute needs to be able to access the data in the user’s request to allow a large foundation model to fulfill it, complete end-to-end encryption is not an option. Instead, the PCC compute node must have technical enforcement for the privacy of user data during processing, and must be incapable of retaining user data after its duty cycle is complete.
[…]
Every production Private Cloud Compute software image will be published for independent binary inspection — including the OS, applications, and all relevant executables, which researchers can verify against the measurements in the transparency log.
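To make the inspection step concrete, here is a minimal sketch of what checking a downloaded PCC image against a published measurement might look like. The LogEntry type and the hex-digest comparison are my own stand-ins; Apple’s actual transparency log has its own schema and signing, which this ignores:

```swift
import Foundation
import CryptoKit

// Hypothetical shape of a transparency log entry: a release identifier
// plus the expected SHA-256 digest of the published software image.
struct LogEntry {
    let releaseID: String
    let imageDigestHex: String
}

// Compute the SHA-256 digest of a downloaded PCC image and compare it
// to the measurement published in the transparency log.
func imageMatchesLog(imageURL: URL, entry: LogEntry) throws -> Bool {
    let imageData = try Data(contentsOf: imageURL)
    let digest = SHA256.hash(data: imageData)
    let digestHex = digest.map { String(format: "%02x", $0) }.joined()
    return digestHex == entry.imageDigestHex
}
```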
Matthew Green:
Then they’re throwing all kinds of processes at the server hardware to make sure the hardware isn’t tampered with. I can’t tell if this prevents hardware attacks, but it seems like a start.
They also use a bunch of protections to ensure that software is legitimate. One is that the software is “stateless” and allegedly doesn’t keep information between user requests. To help ensure this, each server/node reboot re-keys and wipes all storage.
[…]
Of course, knowing that the phone is running a specific piece of software doesn’t help you if you don’t trust the software. So Apple plans to put each binary image into a “transparency log” and publish the software.
But here’s a sticky point: not with the full source code.
Security researchers will get some code and a VM they can use to run the software. They’ll then have to reverse-engineer the binaries to see if they’re doing unexpected things. It’s a little suboptimal.
And I don’t understand how you can tell whether the binary image in the log is actually what’s running on the compute node.
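As I understand the design, the intended answer to that last question is attestation: before a device sends a request, the PCC node presents measurements of the software it booted, signed by keys rooted in its Secure Enclave, and the device checks both the signature and that the measurement appears in the transparency log. A simplified sketch with invented types, ignoring the real certificate chain and key-release mechanics:

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified attestation: the node's measured software
// digest plus a signature over it from a key that chains back to the
// node's Secure Enclave.
struct Attestation {
    let measurementHex: String
    let signature: Data
    let nodePublicKey: P256.Signing.PublicKey
}

// The device only releases a request if (1) the signature verifies and
// (2) the attested measurement appears in the published transparency log.
func shouldSendRequest(to attestation: Attestation,
                       transparencyLog: Set<String>) -> Bool {
    guard let signature = try? P256.Signing.ECDSASignature(
        rawRepresentation: attestation.signature) else {
        return false
    }
    let measurementData = Data(attestation.measurementHex.utf8)
    let signatureValid = attestation.nodePublicKey.isValidSignature(
        signature, for: measurementData)
    let measurementPublished = transparencyLog.contains(attestation.measurementHex)
    return signatureValid && measurementPublished
}
```

The second condition is the important one: a valid signature alone would only show the node is genuine Apple hardware, not that it is running software the public can inspect.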
Matthew Green:
As best I can tell, Apple does not have explicit plans to announce when your data is going off-device to Private Cloud Compute. You won’t opt into this, you won’t necessarily even be told it’s happening. It will just happen. Magically.
[…]
Wrapping up on a more positive note: it’s worth keeping in mind that sometimes the perfect is the enemy of the really good.
[…]
I would imagine they’ll install these servers in a cage at a Chinese cloud provider and they’ll monitor them remotely via a camera. I don’t know how you should feel about that.
Aside from the source code issue, it’s not clear to me what more Apple could reasonably do. Let researchers inspect the premises? They’re making a strong effort, but that doesn’t mean this system is actually as private as on-device. You have to trust their design and all the people implementing it and hope there aren’t any bad bugs.
Matthew Green:
It’s a very thoughtful design. Indeed, if you gave an excellent team a huge pile of money and told them to build the best “private” cloud in the world, it would probably look like this.
Francisco Tolmasky:
I’ve asked a lot of people: “OK, imagine Facebook implemented the same system, you’d be fine using it?” Their answer was “Well, no…” Because at the end of the day this system still fundamentally relies on trust. None of this stuff is actually verifiable. And that becomes crystal clear when you realize that you wouldn’t trust it if you simply switched out the names. No one is saying they’re not trying, but that’s different than having created an actually secure system.
Francisco Tolmasky:
Shell game: We put the data under the “local processing cup,” mention you need servers, start swapping cups around, invent a nonsense term “Private Cloud Compute” & voila! These are SPECIAL servers. That’s how you go from “local matters” to “we’re doing it on servers!”
Francisco Tolmasky:
Something that gets lost in discussions about trust is the kind of trust you actually need. Plenty of people trust Apple’s intentions. But with the cloud you actually further need to trust they, e.g., never write any bugs. That they have perfect hiring that catches someone trying to infiltrate them, despite it being super tempting for a gov to try. That they’ll shut the whole feature down if a gov passes a data retention law. This seems pedantic, but these were Apple’s own arguments in the past.
Jeff Johnson:
The so-called “verifiable transparency” of Private Cloud Compute nodes is a bad joke. They’re mostly closed source, so security researchers would have to reverse engineer almost everything. That’s the opposite of transparency.
Only Apple could claim that closed source is transparent. Orwellian doublespeak.
Update (2024-06-18): Sean Peisert:
My question is why Apple is doing Private Cloud Computing rather than Confidential Computing (e.g., AMD SEV, Intel TDX) to have entirely hardware-enforced isolation, and I guess the obvious answer is that they haven’t built that level of technology into Apple Silicon yet.
Rob Jonson:
You still need to trust that Apple is running the software they say they are.
You also need to trust that they can ignore the NSA if they get an NSA letter demanding that they secretly change the software to enable NSA snooping.
They can’t tell you if the NSA demands that.
Khaos Tian:
Did I miss something on Apple’s PCC setup? If the attestation chain of trust is ultimately traced back to a private key Apple manages, wouldn’t they be able to fake attestation and trick the end device into talking to nodes that are running non-public PCC software?
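That is the crux: verification ultimately terminates in a root key that Apple controls, so a signature check by itself trusts whatever the key holder chooses to sign. A toy sketch, with names and structure that are mine rather than Apple’s, of why clients also need to insist that the attested measurement appear in the public log:

```swift
import Foundation
import CryptoKit

// Toy model: a measurement plus a signature from the root key holder.
struct SignedMeasurement {
    let measurementHex: String
    let signature: P256.Signing.ECDSASignature
}

let appleRootKey = P256.Signing.PrivateKey()        // held by Apple
let trustedRootPublicKey = appleRootKey.publicKey   // baked into devices

// A check that only validates the signature trusts any software the
// root key holder chooses to sign, public or not.
func naiveTrustCheck(_ signed: SignedMeasurement) -> Bool {
    trustedRootPublicKey.isValidSignature(signed.signature,
                                          for: Data(signed.measurementHex.utf8))
}

// Also requiring the measurement to appear in the transparency log is
// what turns "trust the key holder" into something outsiders can audit.
func auditedTrustCheck(_ signed: SignedMeasurement,
                       publishedMeasurements: Set<String>) -> Bool {
    naiveTrustCheck(signed) && publishedMeasurements.contains(signed.measurementHex)
}

// The key holder signs a build that never appears in the public log;
// the signature-only check still accepts it.
let secretMeasurement = "digest-of-a-non-public-build"
let signature = try! appleRootKey.signature(for: Data(secretMeasurement.utf8))
let signed = SignedMeasurement(measurementHex: secretMeasurement, signature: signature)
print(naiveTrustCheck(signed))                               // true
print(auditedTrustCheck(signed, publishedMeasurements: []))  // false
```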
Update (2024-06-24): Saagar Jha:
Apple seems to just categorically fail at threat models that involve themselves. I guess for iPhone you just suck it up and use it anyway but for this the whole point is that it’s supposed to be as secure as on-device computation so this is kind of important.
Even shelving insider threat, there are a lot of words for “we did TPM”.
[…]
To be 100% clear: you know how NSO or Cellebrite keep hacking iPhones? This thing is made so that if you do that to PCC, you get to see what is going on inside of it. And because of how TPMs work it will likely send back measurements to your phone that attest cleanly.
The “solution”, as far as I can tell, is that Apple thinks they would catch attempts to hack their servers. Oh yeah also hacking the server is hard because they used Swift and deleted the SSH binary. Not like they ship an OS like that already to a billion people.
Also other people have been grumbling about this but I’ll come out and say it: gtfo with your “auditability”. You don’t care about auditability. You care about your intellectual property. This blog post is hilariously nonsensical.
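Saagar’s earlier point about measurements attesting cleanly is worth unpacking: measurements are recorded as software is loaded, so a node exploited at runtime still reports the same boot-time digests. Here is a toy model, not Apple’s implementation, of a measured-boot register in the style of a TPM PCR:

```swift
import Foundation
import CryptoKit

// Toy measured-boot register, extended once per component as it loads,
// in the style of a TPM PCR.
struct MeasurementRegister {
    private(set) var value = Data(repeating: 0, count: 32)

    mutating func extend(with component: Data) {
        let componentDigest = Data(SHA256.hash(data: component))
        value = Data(SHA256.hash(data: value + componentDigest))
    }
}

var register = MeasurementRegister()
register.extend(with: Data("bootloader v1".utf8))
register.extend(with: Data("PCC OS image".utf8))
register.extend(with: Data("inference stack".utf8))

let attestedAtBoot = register.value

// A runtime exploit that tampers with memory after boot never calls
// extend(with:), so the value the node attests to is unchanged.
let attestedAfterRuntimeCompromise = register.value
assert(attestedAtBoot == attestedAfterRuntimeCompromise)
```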
See also: James Darpinian.
Update (2024-07-02): Rich Mogull:
Here is where Apple outdid itself with its security model. The company needed a mechanism to send the prompt to the cloud securely while maintaining user privacy. The system must then process those prompts—which include sensitive personal data—without Apple or anyone else gaining access to that data. Finally, the system must assure the world that the prior two steps are verifiably true. Instead of simply asking us to trust it, Apple built multiple mechanisms so your device knows whether it can trust the cloud, and the world knows whether it can trust Apple.
[…]
So, Apple can’t track a request back to a device, which prevents an attacker from doing the same unless they can compromise both Apple and the relay service. Should an attacker actually compromise a node and want to send a specific target to it, Apple further defends against steering by performing statistical analysis of load balancers to detect any irregularities in where requests are sent.
[…]
Apple will publish the binary images of the software stack running on PCC nodes. That’s confidence and a great way to ensure the system is truly secure—not just “secure” because it’s obscure.
I don’t know—a binary image is certainly on the spectrum toward obscurity. And it is still not clear to me how it can be proven that the image that you inspected is the same as the one that’s actually running on the node.
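Going back to the steering defense Mogull describes, the idea is presumably to flag nodes that receive a disproportionate share of requests. A rough sketch of that kind of check; the uniform-share assumption and the threshold are mine, not Apple’s methodology:

```swift
import Foundation

// Flag any node whose share of recent requests is far above what
// uniform routing would predict. The threshold is arbitrary and only
// for illustration.
func suspiciousNodes(requestCounts: [String: Int],
                     tolerance: Double = 3.0) -> [String] {
    let total = requestCounts.values.reduce(0, +)
    guard total > 0 else { return [] }
    let expectedShare = 1.0 / Double(requestCounts.count)
    return requestCounts.compactMap { node, count in
        Double(count) / Double(total) > tolerance * expectedShare ? node : nil
    }
}

// A node being steered a disproportionate number of requests stands out.
let counts = ["node-a": 40, "node-b": 38, "node-c": 41, "node-d": 900]
print(suspiciousNodes(requestCounts: counts))   // ["node-d"]
```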
Update (2024-09-13): Lily Hay Newman (via John Voorhees):
“We set out from the beginning with a goal of how can we extend the kinds of privacy guarantees that we’ve established with processing on-device with iPhone to the cloud—that was the mission statement,” Craig Federighi, senior vice president of software engineering at Apple, tells WIRED. “It took breakthroughs on every level to pull this together, but what we’ve done is achieve our goal. I think this sets a new standard for processing in the cloud in the industry.”
Nick Heer:
I would hope so — an iPhone 15 with an A16 chip is not compatible with Apple Intelligence. An iPhone 15 Pro and its A17 Pro chip would be a better comparison. I do not know whether this error is Apple’s or the reporter’s, but it has survived a full day since the article’s publication.
[…]
Wired appended a cheeky note to the article saying it “was updated with clarification on the Apple Intelligence-generated image Federighi created for his dog’s birthday and additional confirmation that she is a very good dog”.
They “corrected” that and added the name of his dog but didn’t fix the substantive error.
Apple Intelligence, Apple Security Bounty, Artificial Intelligence, Cloud, iOS, iOS 18, Mac, macOS 15 Sequoia, Privacy, Private Cloud Compute, Secure Enclave, Swift Programming Language
Benjamin Mayo (Hacker News):
App Review has rejected a submission from the developers of UTM, a generic PC system emulator for iPhone and iPad.
The open source app was submitted to the store, given the recent rule change that allows retro game console emulators, like Delta or Folium. App Review rejected UTM, deciding that a “PC is not a console”. What is more surprising is that UTM says Apple is also blocking the app from being listed in third-party app stores in the EU.
As written in the App Review Guidelines, Rule 4.7 covers “mini apps, mini games, streaming games, chatbots, plug-ins and game emulators”.
UTM says Apple refused to notarize the app because of the violation of rule 4.7, as that is included in Notarization Review Guidelines. However, the App Review Guidelines page disagrees. It does not annotate rule 4.7 as being part of the Notarization Review Guidelines. Indeed, if you select the “Show Notarization Review Guidelines Only” toggle, rule 4.7 is greyed out as not being applicable.
UTM:
Apple has reached out and clarified that the notarization was rejected under rule 2.5.2 and that 4.7 is an exception that only applies to App Store apps (but which UTM SE does not qualify for).
This is confusing, but I think what Apple is saying is that, even with notarization, apps are not allowed to “download executable code.” Rule 2.5.2 says apps may not “download, install, or execute code” except for limited educational purposes. Rule 4.7 makes an exception to this so that retro game emulators and some other app types can run code “that is not embedded in the binary.” This is grayed out when you select Show Notarization Review Guidelines Only, meaning that the exception only applies within the App Store. Thus, the general prohibition remains in effect for App Marketplaces and Web Distribution. But it seems like this wasn’t initially clear to Apple, either, because the review process took two months.
This also seems inconsistent with the fact that the Delta emulator is allowed to be notarized outside the App Store. It doesn’t make much sense for the rules to be more lax within the App Store. I first thought the mistake was that Apple didn’t mean to gray out 4.7 for notarization. Then everything would make sense. But the clarification states that 4.7 is not intended to apply to notarization.
The bottom line for me is that Apple doesn’t want general-purpose emulators, it’s questionable whether the DMA lets it block them, and even siding with Apple on this, it isn’t consistently applying its own rules.
kelthuzad:
If Apple can block what’s on “independent” third-party app stores, then the letter of the DMA may be violated or not, but its spirit is most certainly violated. Hope the EU cracks down on such malicious compliance.
Steve Troughton-Smith:
Apple needs to read the terms of the DMA again; Apple can’t reject UTM from distribution in third party marketplaces, in just the same way it can’t prevent Epic from building an App Store. App Review is going to land them yet another clash with the EU, and a potential fine-worthy rule violation.
Thomas Clement:
Sigh… what is even the point of third-party distribution if Apple is going to block whatever competition it does not want to see there?
Miguel Arroz:
This is so stupid. UTM is an essential tool for my work, running stuff I need 24/7. This shows that 1. The EU didn’t go far enough in telling tech companies the products people buy belong to them and they must be able to run whatever the hell they want on those products, regardless of whether some multinational company likes it or not, and 2. Every platform Apple makes is not targeted for real work and productivity except macOS, and that’s mostly for historic reasons.
UTM:
We will abide by Apple’s content and policy decision because we believe UTM SE (which does not have JIT) is a subpar experience and isn’t worth fighting for. We do not wish to invest any additional time or effort trying to get UTM SE in the App Store or third party stores unless Apple changes their stance.
gorkish:
I remember the flash-in-the-pan moment where through some strange conflux of exploits and firmware features UTM on iOS was able to access full hardware virtualization support. It was a glorious glimpse into an alternate reality that we will likely never get to see again.
I don’t have enough superlatives to express my disappointment when seeing all of that effort suppressed and restricted by Apple.
When the UTM authors say “it’s not worth it” -- they may be onto something. Apple is slowly but surely beginning to be “not worth it” for me and for many other professional users.
Update (2024-06-19): John Gruber:
Apple’s stance on this seems inscrutable and arbitrary: retro game emulators are, at long last, acceptable, but general PC emulators are not. Such arbitrary policy decisions related to the purpose of the app are fine for the App Store (legally speaking), but clearly not compliant with the DMA. That’s one of the few areas where the DMA is clear. Apple can, of course, ban (say) porno apps from the App Store, but can’t refuse to notarize them for distribution outside the App Store in the EU.
Apple has a security leg to stand on when it comes to JIT compilation, but the version of UTM (UTM SE) that was held up in review for two months, and ultimately rejected by Apple, doesn’t use a JIT. […] That restriction should, in theory, be permitted under the DMA on security grounds. But how the no-JIT version of UTM could be rejected for notarization, I do not see.
And, again, Delta is a retro game emulator, but that’s, officially at least, not why it’s able to be notarized, because the retro game emulator exception doesn’t apply for notarization. If Apple were being consistent it would either notarize both Delta and UTM or neither.
Jason Snell:
In other words, parts of Apple apparently think that they can enforce inconsistent and arbitrary rules even outside the App Store, which is contrary to the entire regulatory process that led to the DMA and the concept of alternative App Stores in the first place.
[…]
The whole point of the DMA is that Apple does not get to act as an arbitrary approver or disapprover of apps. If Apple can still reject or approve apps as it sees fit, what’s the point of the DMA in the first place?
See also: Accidental Tech Podcast, Ben Lovejoy.
Antitrust, App Marketplaces, App Store, App Store Rejection, Delta Emulator, DMA Compliance, Emulator, European Union, iOS, iOS 17, iOS App, Just-In-Time Compilation (JIT), Notarization, Open Source, Top Posts, UTM, Web Distribution of iOS Apps, Windows