Wednesday, October 25, 2023

AirPods Pro With iOS 17

Joe Rossignol:

iOS 17 adds several features to all second-generation AirPods Pro, including Adaptive Audio, Conversation Awareness, and Personalized Volume. Keep in mind that these software features are also available on the original second-generation AirPods Pro released in September 2022, so there is no need to upgrade to the USB-C model to use them.

Tim Hardwick:

Keep reading to learn what use cases the new noise control features are designed for, and how you can control them in iOS 17 when your iPhone is connected to AirPods Pro 2 with updated firmware.

Six Colors:

There’s a new listening mode, Adaptive Audio, that sort of sits in between Noise Cancellation and Transparency modes. (There’s even a new sound effect when you enter this mode, distinct from the chimes for the other modes.) According to Apple, when you’re in this mode, noise cancellation is emphasized in noisier environments and Transparency in quieter conditions.

I’ve spent the entire summer walking and running around my neighborhood and taking plane trips using a beta version of this new firmware. It’s basically replaced Transparency mode for me—in fact, I’ve set my AirPods Pro to toggle exclusively between Noise Cancellation and Adaptive Audio.

[…]

Another new feature is Personalized Volume, which supposedly adjusts playback volume as your environment changes as well as based on learning your preferences in various contexts. I’ve kept this feature on all summer, but it never felt like it worked right. Perhaps it was doing some amazing adjustments and I just never noticed, but what it mostly seemed to do was turn down the volume on my podcasts just as I was about to head out the door for a run. I would invariably turn the volume back up, but it never seemed to learn its lesson.

Tim Hardwick:

In this way, Adaptive Audio aims to automatically reduce loud or distracting noises in your surroundings, such as the sound of a leaf blower or a passing plane overhead, while other noises, like the sudden beep of a car horn, remain audible.

In a new interview with TechCrunch, Apple’s VP of sensing and connectivity Ron Huang revealed that Apple originally considered using GPS location to inform AirPods Pro of the user’s whereabouts and adapt the audio experience accordingly. In real-world testing, however, the method proved inefficient.
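Apple hasn’t said how Adaptive Audio is actually implemented, but as a rough mental model of the behavior described above, here’s a minimal sketch: derive the noise-cancellation weight from a slow-moving average of ambient loudness, so that sustained noise (a leaf blower, a passing plane) shifts the blend toward cancellation while a brief car horn moves it far less and stays audible. All of the names and tuning values below are made up for illustration.

```swift
import Foundation

// Illustrative only: not Apple's algorithm. A stream of short-term ambient
// loudness readings (dB) drives a blend weight between Transparency (0) and
// Noise Cancellation (1) via an exponential moving average, so sustained noise
// moves the blend much more than a brief spike does.
struct AdaptiveBlendSketch {
    // Hypothetical tuning values.
    let quietFloor = 45.0   // at or below this sustained level, favor transparency
    let loudCeiling = 75.0  // at or above this sustained level, favor cancellation
    let smoothing = 0.1     // small factor: brief spikes barely move the average

    var sustainedLevel = 50.0 // running estimate of sustained ambient loudness

    /// Feed one loudness reading; returns 0 (full transparency) ... 1 (full cancellation).
    mutating func update(ambientLevel: Double) -> Double {
        // Exponential moving average: a leaf blower raises this over several readings,
        // a one-off car horn hardly changes it.
        sustainedLevel += smoothing * (ambientLevel - sustainedLevel)
        let t = (sustainedLevel - quietFloor) / (loudCeiling - quietFloor)
        return min(max(t, 0), 1)
    }
}

// Example: a quiet street, a bus idling nearby, then a brief horn.
var blend = AdaptiveBlendSketch()
for level in [48.0, 50, 52, 74, 75, 76, 75, 95, 60, 52] {
    let weight = blend.update(ambientLevel: level)
    print(String(format: "ambient %3.0f dB -> cancellation weight %.2f", level, weight))
}
```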


Update (2023-10-27): John Gruber:

I’ve been wearing a pair of the revised AirPods Pro 2 earbuds since last week, paired with my year-old iPhone 14 Pro and a few other devices. I obviously can’t say anything about their special capabilities when paired with a Vision Pro, but in all regards related to currently-shipping features, they’re better than ever.

[…]

Conversation Awareness really is completely automatic. If you’re listening to music or a podcast and just start talking to someone, or if someone else just starts talking to you, it kicks in. It’s very clever, but whether you’ll enjoy it highly depends upon your listening environment. In my 5+ days of testing, it kicked in too frequently amidst a crowd of people, none of whom were talking to me. Sometimes on city sidewalks, oftentimes in a grocery store. In an urban environment, there are just too many people talking around me, and the AirPods have no way of knowing that they’re not talking to me, for this feature to be anything but an annoyance overall.

[…]

Transparency with AirPods Pro 2 has been great as an urban pedestrian; Adaptive is even better. It just automatically Does What I Want™ in seemingly every context. I hear traffic and passersby, but even loud trucks and buses passing by don’t keep me from clearly hearing the podcast (typically) or song (less typically) I’m listening to.

1 Comment:

I find Adaptive Audio to be fantastic. I use AirPods in an urban environment while walking and biking, and it’s the middle ground I never knew I needed. Truly great work on Apple’s part.

Jury’s still out for me on Conversation Awareness. I certainly don’t find it useful for keeping headphones in while talking to someone. That’s both socially awkward IMO and distracting, since the audio ducking doesn’t know whether there’s a pause in the conversation or it’s over, so it cuts in and out at odd times. That said, I have found it useful when walking around and occasionally interacting with a stranger for a few seconds (think: getting on a bus and saying hello to the driver), where taking headphones out of my ears is a hassle. But then again, when using Adaptive Audio in conjunction with Conversation Awareness, it seems like the audio ducks but the ambient noise doesn’t fade in quickly enough. In these situations I think I want Adaptive Audio to switch instantly to full Transparency. It’s intentionally designed to be a gradual transition, however.

Personalized Volume was so random that I turned it off after less than a week. Sometimes I’d realize it had magically adjusted the volume well. But too frequently I’d find my volume suddenly getting quiet or loud for no apparent reason. The miss rate is way too high; it should be labeled a beta feature.

One other note: the faster device switching is a massive improvement. And for the first time since Apple added auto-switching as a feature, it truly feels like it works as advertised. The simultaneous improvements to both those features mean I am starting to trust that my AirPods will play from the right device instantly, without my having to manually select them from the device I’m using.
