Android imposes strong restrictions on how audio plugin UIs can be architected:
- Only one application can show its UI at a time: either the DAW or the plugin Activity is visible to the user, never both.
- An Android application (in this case the host/DAW) cannot execute code from other applications (in this case, the plugins).
Switching between application activities is quite annoying, especially when it involves audio glitches.
On the other hand, for most existing portable plugin GUIs (e.g. those written with juce_gui_basics), switching activities is still better than nothing.
We will provide both solutions.
There will be two kinds of GUI extensions for AAP:
- in-process GUI
- cross-process GUI
Cross-process GUI is based on Web technology (HTML/CSS/SVG/JS/wasm). In-process GUI can be anything.
Android apps cannot load external native code dynamically, so it is impossible to run any GUI code that depends on native code. For example, Flutter, React Native, and Xamarin depend on native runtimes. Ionic and React.js should be good to go. Flutter for Web, MAUI, and Jetpack Compose for Web should still be good to go too.
(AAP itself is not going to provide any hosting support for those runtimes. Applications are free to provide a common GUI runtime to make it happen, but at this stage we have no idea how such extensibility could be made usable. These non-platform frameworks usually package the runtime within the user application, and there is usually no clean way to expose the runtime provider publicly, other than a development runtime.)
As in any GUI application framework, AAP audio processors and their UI parts should not be mixed.
LV2 achieves this code separation in a clean way: the audio processor module and the UI module are separate shared libraries, and the interaction between these components is achieved through port I/O. We follow this manner for cross-process UI too. Note that we need a yet different programming architecture for cross-process interaction between the audio process and the UI.
(It is still out of scope that a "UI host" instantiates some UI, another audio plugin host instantiates the (non-UI) plugin, and those three apps interact together. That would be unnecessarily complicated.)
An AAP can provide built-in UIs. In fact, plugin Web UIs can be provided by anyone, as an AAP UI can interact only with the plugin API described later. A host can query plugin UIs in a (TBD) way, acquire the Web UI, load it into its own WebView, and then let the Web UI that was loaded from another package interact with its in-process plugin instances.
There is a new metadata element under the `<plugin>` element in `aap_metadata.xml` to describe the built-in UIs:

```xml
<ui web="Android_Intent_Name_Web" activity="Android_Intent_Name_Activity" />
```

Both `Android_Intent_Name_*` values take the form `app_package_name/content_provider_name`. For `Android_Intent_Name_Web`, the AAP host queries the service on the Android device (or emulator). For `Android_Intent_Name_Activity`, it would just launch the specified activity, which can be either within the same app or in another application.
The web content provider is then used to provide the "GUI package", which is a set of HTML files packaged as a zip.
Both UI styles are optional; a plugin can support only Web, or only activity, or both (or none).
The GUI zip package contains `index.html`, which is loaded into an Android `WebView` for the plugin; anything else is optional.
There is a JavaScript interface called the AAP Plugin Access API that is hooked in by the AAP host WebView. It would be possible to provide a polyfill for that interface for use anywhere else.
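The shape of the Plugin Access API is not defined in this document, so the following is a purely hypothetical polyfill sketch; the object name `AAPClient` and its members `hosted`, `write`, and `onNotify` are invented placeholders, not part of any spec. The idea is only that a polyfill can install a no-op stand-in when the page runs outside an AAP host WebView, so the UI can still be developed in a plain browser:

```javascript
// Hypothetical polyfill sketch. `AAPClient` and its members are invented
// placeholders for whatever object the AAP host WebView would inject.
const globalScope = typeof window !== "undefined" ? window : globalThis;

if (!globalScope.AAPClient) {
  globalScope.AAPClient = {
    hosted: false, // lets the UI detect it is running standalone
    write(port, data) {
      // no-op outside a host WebView; a real host would forward this
    },
    onNotify(callback) {
      // never fires outside a host
    },
  };
}
```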
The HTML must come with a global JavaScript object called `AAPHostInterop`. It is an `Object` that comes with these function members:

- `onInitialize: Function()`: the host must invoke this function when it initializes the UI.
- `onShow: Function()`: the host must invoke this function whenever it shows the plugin UI beforehand (possibly from the hidden state). It is invoked after `onInitialize()` too.
- `onHide: Function()`: the host must invoke this function whenever it hides (but does not destroy) the plugin UI afterward. It is invoked before `onCleanup()` too.
- `onCleanup: Function()`: the host must invoke this function when it disposes of the UI.
- `onNotify: Function(port: Number, data: Uint8Array, offset: Number, length: Number)`: the host invokes this function to notify the UI about plugin/host events (e.g. parameter changes). It may run on a different host thread than any of the other events listed above.
- `listen: Function(port: Number)`
- `write: Function(port: Number, data: Uint8Array, offset: Number, length: Number)`: the user can send data to the specified port.
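The lifecycle above can be sketched as a minimal `AAPHostInterop` object together with a mock host driving it. This is only an illustration: the `sendToHost` callback stands in for whatever channel the real host WebView injects for outgoing `write()` data, which this document does not yet define.

```javascript
// Minimal sketch of a plugin Web UI's AAPHostInterop object.
// `sendToHost` is an assumed placeholder for the host-provided channel.
function createAAPHostInterop(sendToHost) {
  const events = [];          // records lifecycle calls, for illustration
  const listened = new Set(); // ports the UI asked to listen on
  return {
    events,
    onInitialize() { events.push("initialize"); },
    onShow()       { events.push("show"); },    // after onInitialize, may repeat
    onHide()       { events.push("hide"); },    // before onCleanup, may repeat
    onCleanup()    { events.push("cleanup"); },
    // The host calls this to deliver plugin/host events (e.g. parameter
    // changes); note it may arrive on a different host thread.
    onNotify(port, data, offset, length) {
      events.push(`notify:${port}:${length}`);
    },
    listen(port) { listened.add(port); },
    write(port, data, offset, length) {
      // The UI sends data toward the plugin through the host.
      sendToHost(port, data.subarray(offset, offset + length));
    },
  };
}

// A mock "host" driving the documented order of invocations:
const sent = [];
const interop = createAAPHostInterop((port, bytes) => sent.push([port, bytes]));
interop.onInitialize();
interop.onShow();
interop.listen(0);
interop.onNotify(0, new Uint8Array([1, 2, 3, 4]), 0, 4);
interop.write(0, new Uint8Array([9, 8, 7]), 1, 2);
interop.onHide();
interop.onCleanup();
```

In an AAP host, the calls made by the mock host here would come from the host WebView instead.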
TODO: should we define a `NotificationEvent` instead of listing those arguments directly (like DOM Events)?

TODO: should we provide `pluginId` and `instanceId` as well? I don't see a need for them so far.
A Web UI host (most likely the same as the audio plugin host) interacts with the plugin service using Binder-based messaging.
There will be additional methods in the AIDL to facilitate audio processing to and from the UI (sending events and receiving notifications on non-audio threads).
TBD: needs a concrete definition.
There are a handful of GUI frameworks for audio plugins, in either C/C++ or other languages:
- Cross-platform UI solutions: ReactJS, Flutter for Web, Jetpack Compose for Web, and MAUI can be options.
- NanoGUI
mod-ui is the closest solution to this idea. I came up with a similar solution called aria2web, which brings the SFZ ARIA extension to HTML5, but mod-ui could bring tighter integration with our LV2 backend.
There was a juce_emscripten effort. It has been inactive, but it is still an option.
Yet aap-juce still needs some way to separate the UI and non-UI parts, and to inject the audio processing part.