Thursday, November 6, 2025

AirPods Live Translation Expands to the EU

Apple (9to5Mac, MacRumors, Reddit):

Next month, Live Translation on AirPods will expand to the EU, making face-to-face conversations easier by helping users communicate even if they don’t speak the same language.

[…]

Live Translation on AirPods is available in English, French, German, Portuguese, Spanish, Italian, Chinese (Simplified and Traditional Mandarin), Japanese, and Korean when using AirPods Pro 3, AirPods Pro 2, or AirPods 4 with ANC paired with an Apple Intelligence-enabled iPhone running the latest software. Live Translation on AirPods was delayed for users in the EU due to the additional engineering work needed to comply with the requirements of the Digital Markets Act.

So what happened here? What was this extra engineering work? Back in September, Apple said:

For example, we designed Live Translation so that our users’ conversations stay private — they’re processed on device and are never accessible to Apple — and our teams are doing additional engineering work to make sure they won’t be exposed to other companies or developers either.

But it doesn’t sound like Apple has opened up Live Translation to third-party Bluetooth devices or to third-party apps. Does the DMA not require that? Or is Apple actually doing that but deliberately left it out of the announcement?

The other main theory for the delay was that Apple had not yet shown the regulators how the feature complied with the GDPR. But would that require “additional engineering work”? Apple was cagey before but now specifically blames the DMA, not the GDPR.

Based only on a plain reading of the public statements above, the logical conclusion is that the initial version of Live Translation had privacy flaws, which the EU forced Apple to address before shipping in that region. That would be a very interesting story and completely at odds with Apple’s framing that the EU’s demands would reduce privacy.

There are some other possibilities. Maybe the feature just wasn’t ready before. Maybe Apple created a false conflict to drum up anti-DMA sentiment. Maybe the EU caved and let Apple ship the feature without changes—though that doesn’t explain the additional engineering. All of Apple’s communication about this feature in the EU seems designed to obscure rather than elucidate, so who knows?

Nick Heer:

If Apple wants to be petty and weird about the DMA in its European press releases, I guess that is its prerogative, though I will note it is less snippy about other regulatory hurdles. Still, I cannot imagine a delay of what will amount to three-ish months will be particularly memorable for many users by this time next year.

Update (2025-11-07): Nick Heer:

Tsai is referencing Apple’s Digital Markets Act press release. After listing the features delayed in the E.U., one of which is Live Translation, and all attributed to the DMA, it goes on to say (emphasis mine):

We’ve suggested changes to these features that would protect our users’ data, but so far, the European Commission has rejected our proposals. And according to the European Commission, under the DMA, it’s illegal for us to share these features with Apple users until we bring them to other companies’ products. If we shared them any sooner, we’d be fined and potentially forced to stop shipping our products in the EU.

[…]

I do not see anything in the release notes about greater third-party support or new APIs.

So it’s still a mystery. Dan points to another theory from Numerama, but I don’t think it makes sense given Apple’s statements, either.

Update (2025-11-26): Nicolas Lellouche (translation via Apple):

To offer live translation in Europe, Apple says it has developed a new API that allows other applications to manage several audio streams simultaneously, as its own service does. For example, it will allow competitors, such as Google Translate or Duolingo, to simultaneously use the iPhone’s microphone and speaker, as well as the AirPods’ microphones and speakers. The idea is to be able to create a closed audio channel, using the iPhone as a microphone and getting the audio back in the earbuds.

So there is a new API, but it doesn’t seem like Apple has announced or documented it.

3 Comments


As a European, I'm used to waiting months or years for new features or gadgets from both Apple and Google.

I don't see anything new here. It's business as usual, except dressed up in self-entitled rage by a company that is used to placing lumps of gold on world leaders' desks to get what it wants.


According to the French publication Numerama, the issue with Live Translation was that Apple uses audio streams from the AirPods and the iPhone simultaneously. This capability was apparently only accessible through a private API, so competitors were at a disadvantage if they wanted to recreate "Live Translation" on iPhones.

https://www.numerama.com/tech/2109757-la-traduction-en-direct-arrive-sur-les-airpods-en-france-mais-apple-a-du-faire-un-sacrifice-en-europe.html


@Dan Thanks. That makes logical sense but doesn’t seem to fit with Apple’s statement about preventing other companies and developers from accessing private data.
