Wednesday, January 27, 2021

It’s Over Between Us, AVAudioEngine

Chris Liscio:

Looking at the crash logs, I noticed that invoking -[AVAudioEngine mainMixerNode] would fire some kind of internal exception. The API does not return any kind of error, nor is it documented that an exception could get raised. Theoretically, failure at this stage should be impossible!

Unfortunately for me, I had written the high-level engine management code in Swift (as was common/prescribed at the time), so I couldn’t even attempt to handle this exception to patch this behavior.

[…]

Just as last time, a 10.14.x update re-introduced the bug, but in a slightly different way.

[…]

Fast-forward to macOS 11, and the AirPods issue is back. In fact, it’s even worse now because there’s no workaround. […] If you’re listening to the Music app, everything’s running at full quality. Then, you load a project in Capo and your already-playing music now sounds like garbage.

Colin Cornaby:

AVAudioEngine has always struck me as an API with good intentions (doing what CoreAudio does in Obj-C/Swift) with questionable execution and missing features.

It’s just so strange that Apple would take their extremely impressive and feature-rich audio toolbox and replace it with... well... this.

Update (2021-01-28): Simone Manganelli:

This has happened over and over and over with APIs throughout macOS in the past decade. PDFKit is a good example.

This is just another symptom of Apple’s annual software update cycle and a lack of commitment to fixing anything.

Francisco Tolmasky:

The Swift/ObjC relationship is like taking the traditional design of a game engine written in C++ with Lua scripting hooks and flipping it on its head, such that you instead wrote the game engine in JavaScript and provided the scripting interface in Haskell.

You can use an Objective-C wrapper to catch exceptions, but be careful trying to make a generic wrapper solution because it’s not safe to throw exceptions through Swift stack frames.
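To make Tolmasky's point concrete, here is a minimal sketch of that kind of wrapper, using a hypothetical `safeMainMixerNode` category method. The risky call and the `@catch` both live in an Objective-C frame, so the exception never has to unwind through Swift stack frames:

```objc
// AVAudioEngine+SafeMixer.m — hypothetical category; compile as Objective-C
// and expose it through the app's bridging header so Swift can call it.
#import <AVFoundation/AVFoundation.h>

@interface AVAudioEngine (SafeMixer)
- (nullable AVAudioMixerNode *)safeMainMixerNode;
@end

@implementation AVAudioEngine (SafeMixer)
- (nullable AVAudioMixerNode *)safeMainMixerNode {
    @try {
        // The undocumented exception (if any) is raised here, inside an
        // Objective-C frame, and caught below before reaching Swift code.
        return self.mainMixerNode;
    }
    @catch (NSException *exception) {
        NSLog(@"-mainMixerNode raised %@: %@", exception.name, exception.reason);
        return nil;
    }
}
@end
```

From Swift, calling `engine.safeMainMixerNode()` then yields an optional to check, rather than an uncatchable crash.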

Chris Liscio:

Great news: I have found a resolution to the (latest of the) AVAudioEngine issues I’ve been seeing with AirPods in Big Sur.

[…]

I am still unable to work around the problem in my code and stop the error for my users. These are in-the-field configuration issues that I am powerless to resolve with a simple software update.

AVAudioEngine is still too limited in its feature set for me to consider (or advise) adopting it for a “pro audio” stack like that which powers something like GarageBand or Logic.

Chris Liscio:

I put together a sample project called CrappifyAudio that demonstrates the problem in a minimal way.

Update (2021-02-05): Chris Liscio (tweet):

AVAudioEngine doesn’t give me an opportunity to state my “intentions” for using the engine explicitly on macOS. There is no API that lets me specify that I am building an output-only engine, and don’t require its input abilities.

The previous workaround was to pull the AUAudioUnit instance out of the outputNode and set its deviceID to override the default device selection behavior. Unfortunately, this workaround no longer works because the aggregate device now appears to get created (activated?) when I call outputNode.
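For reference, a sketch of the workaround being described, assuming the `AUAudioUnit` is reached via `outputNode`'s `AUAudioUnit` property (added around macOS 10.15) and that `deviceID` stands in for an `AudioDeviceID` the caller has already looked up via Core Audio:

```objc
#import <AVFoundation/AVFoundation.h>

// Sketch of the (now-broken) workaround: pin the output unit to a specific
// device to override the engine's default device selection behavior.
static BOOL PinOutputDevice(AVAudioEngine *engine, AudioDeviceID deviceID) {
    NSError *error = nil;
    if (![engine.outputNode.AUAudioUnit setDeviceID:deviceID error:&error]) {
        NSLog(@"Failed to override output device: %@", error);
        return NO;
    }
    return YES;
}
```

As described above, on Big Sur merely touching `outputNode` now appears to create (or activate) the aggregate device, which is why this no longer helps.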

[…]

This is the crux of the problem I’m dealing with here, and why I don’t want anything to do with this API anymore. It’s simply too magical for my tastes.



Sigh. This API really does waaaant to be good, but the pace at which it's being improved and its internal problems fixed is just far, far too slow. It seems to me that the general usefulness of it (like not being able to do an output-only graph) is hampered by an initial set of requirements to support games (and similar uses) on the iPhone. It came right after SpriteKit etc., and I think that's not a coincidence, which is why it still pales in comparison to the full Core Audio APIs on macOS. Each WWDC I keep hoping it's the year they're finally going to beef it up and make it the full replacement for Core Audio, like AVFoundation and VideoToolbox were for QuickTime.
