Audio Unit Extensions on iOS
The upcoming iOS 9 update seems to have some exciting features for us musicians. The first one is multitasking, which will let us use two apps at the same time. This will be quite useful if you want to control two synth apps simultaneously, or record a synth app into a DAW app.
I was already quite excited about that one, but I just discovered Apple demo code for another feature, which might make that use of multitasking unnecessary.
I'm talking about Audio Unit extensions. There have been rumors about this since Monday (the start of WWDC). The iOS 9 change log says:
The Audio Unit extension point allows your app to provide musical instruments, audio effects, sound generators, and more for use within apps like GarageBand, Logic, and other Audio Unit host apps. The extension point also brings a full audio plug-in model to iOS and lets you sell Audio Units on the App Store.
The example app (which can be found here: AudioUnitV3Example: A Basic AudioUnit Extension and Host Implementation) finally clarifies what it really is. Please note that, at this point, I haven't watched the corresponding WWDC session video because it is not yet available. Things might turn out differently from what I know so far.
A little background: With inter-app audio (IAA), a host app can build an audio-processing graph that contains other (client) apps. The most popular example of such a host app is Audiobus (although that app was available long before IAA). By introducing this technology in iOS 8, Apple allowed apps to behave like plug-ins while each app kept its independence.
Apple has now gone one step further and allows host apps to present the UI of their client apps directly in their own interface. That means switching between the host and the client apps is no longer necessary. This is a lot like on Mac OS, where a host app (like Logic) presents the windows of the Audio Units itself.
On iOS 9, this becomes possible through a technology that was actually already introduced in iOS 8: app extensions. They allow an app to show its UI inside another app. For example, the Facebook app has an extension that lets you post an image directly from the Photos app. Prior to iOS 8, you had to go to the Facebook app and select the photo from there.
Audio Unit extensions bring that concept to audio apps.
Hopefully this brings more than ease of use in terms of the UI. IAA had nothing like Audiobus' state saving: although a host app could save the audio graph the user created, it could not save the state of the client apps. For comparison, if you load a Logic project on Mac OS, your Audio Units restore their states, of course. With Audio Unit extensions, I hope this becomes possible on iOS as well. DAWs would profit from that, and iMIDIPatchbay (which I develop, ahem) could finally manage different states of your synth apps in a far easier way than by sending Program Changes. (Update: Yes, see below!)
I will further investigate the example code, while waiting for the WWDC video.
Update: The WWDC session video introduces the new version 3 API, which is object-oriented (as opposed to the old C API) and makes a lot of things easier. "Bridges" should make it easy for hosts and clients built on different API versions to work with each other. The new API lets Audio Units run in their own process on Mac OS. This has the advantage that a crash of the Audio Unit won't crash the host. It also brings some performance overhead, which is why hosts and clients can opt out of it.
It seems to be very easy to bring Mac OS Audio Units to iOS, because the whole audio code is the same. The only differences are related to the different UI frameworks on the two platforms (UIKit vs. AppKit). This means that we might expect more Audio Units from Mac OS coming to iOS. However, I want to emphasize for the non-coders here that these will still be separate apps, which you have to buy from the App Store. You won't be able to copy your Audio Units from your Mac to the iPad, because they have to be built for a different CPU architecture, among other reasons.
Audio Units on iOS can now, like on Mac OS, register parameters that can be recorded and modified in the host app. So the host can restore the state of the Audio Unit (which is definitely my favorite news today). The fact that the new API version supports key-value coding really comes in handy in that context.
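To give an idea of what this looks like in code, here is a minimal sketch of publishing a single parameter through the new AUParameterTree API. The "cutoff" parameter, its address, and its range are made up for illustration; the tree itself is what a host would walk to record, automate, and restore parameter state.

```swift
import AudioToolbox

// Define a single filter-cutoff parameter (identifier, address, and
// range are illustrative, not from any real Audio Unit).
let cutoff = AUParameterTree.createParameter(
    withIdentifier: "cutoff",
    name: "Filter Cutoff",
    address: 0,
    min: 20, max: 20_000,
    unit: .hertz, unitName: nil,
    flags: [.flag_IsReadable, .flag_IsWritable],
    valueStrings: nil, dependentParameters: nil)

// The Audio Unit publishes its parameters as a tree; the host can
// enumerate them, record changes, and write values back to restore state.
let tree = AUParameterTree.createTree(withChildren: [cutoff])

// A host restores a saved state simply by setting the parameter values.
cutoff.value = 1_000
```

Because the parameters are ordinary KVC-compatible objects, a host can observe and set them generically, without knowing anything about the specific Audio Unit.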
Sending and receiving MIDI between host and client works similarly to updating parameters. This might be the long-overdue improvement that lets iOS apps communicate via MIDI without the hassle of selecting virtual ports.
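On the host side, I would expect sending MIDI to a loaded Audio Unit to look roughly like this sketch, which uses the AUAudioUnit's scheduleMIDIEventBlock. The note and velocity values are arbitrary examples:

```swift
import AudioToolbox

// Build a raw MIDI note-on message (status byte 0x90 plus channel).
func noteOnBytes(note: UInt8, velocity: UInt8, channel: UInt8 = 0) -> [UInt8] {
    return [0x90 | channel, note, velocity]
}

// Host side: once an AUAudioUnit is instantiated, the host can push MIDI
// events to it through its scheduleMIDIEventBlock -- no virtual MIDI
// ports need to be created or selected.
func sendNoteOn(to unit: AUAudioUnit, note: UInt8, velocity: UInt8) {
    guard let schedule = unit.scheduleMIDIEventBlock else { return }
    let bytes = noteOnBytes(note: note, velocity: velocity)
    schedule(AUEventSampleTimeImmediate, 0, bytes.count, bytes)
}
```

The appeal is that the connection is implicit in hosting the extension: the host already holds the AUAudioUnit object, so there is no port list for the user to manage.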
Inter-app audio is not deprecated; it is more like a subset of the new API. Apple seems to have received enough feedback about the lack of parameter control and the annoyance of app switching that they decided to bring the whole Audio Unit infrastructure to iOS instead of only adding some small features here and there to IAA.
They also presented some screenshots showing how Audio Unit extensions will be handled in Apple's own apps, like GarageBand: There will be a list of installed Audio Unit extensions (which come to your iPad or iPhone by installing the corresponding app from the App Store). After selecting one, GarageBand shows the extension's UI above an on-screen keyboard, so Audio Unit extensions should not present their own keyboard. The fact that an AU's UI can be displayed in different container sizes shows how important it is now to use Auto Layout. However, they showed some concrete sizes for which developers of Audio Unit extensions should prepare. The slide says:
- iPad Air: 2048x670
- iPhone 6+: 2208x726
- iPhone 6: 1334x404
- iPhone 5s: 1136x350
Maybe this also says something about which devices we can expect Audio Unit extensions to work on.
I can't say yet whether two instances of an Audio Unit extension can run at the same time on iOS. That would be even more exciting.