This is the Nth version of the AAP GUI extension specification draft. Every time I revisit this topic, I come up with different ideas.
Compared to other audio plugin formats, AAP has both simplifying and complicating aspects in GUI support. We only have to care about Android; there is no concern over multiplatform support. It is similar to AUv3, which only cares about iOS and macOS.
That said, we would like to be able to present the plugin UI in multiple ways:
- in the host process, using a WebView and a per-plugin Web UI
- in a plugin-process View, by showing an overlay window using `WindowManager`
- in a plugin-process Activity, either via Activity Embedding for tablets and foldables (which could run side by side with the host), or in the worst case, by switching to it (then the OS may kill the host process at any time!)
We have to provide GUI entry points on both the Kotlin/Java side and the native side. The native part is most likely done via JNI invocation, but native UI toolkits (like `juce_gui_basics`) could make it complicated.
GUI instantiation is supposed to be asynchronous (the actual implementation is synchronous so far).
To support an in-host-process GUI, a plugin UI must be cross-process ready.
Like LV2 UI, a cross-process ready AAP GUI must interact with the rest of the plugin program only via limited channels. LV2 uses Atom sequences; we use MIDI2 UMP transports. Any complicated operations (e.g. an instruction to open a file and get the result from a file dialog) will typically have to be implemented as System Exclusive messages or MDS (Mixed Data Set).
A cross-process ready GUI interoperates with the plugin via ports dedicated to the UI, which are typically auto-populated channels: (1) a UMP input channel and (2) a UMP output channel. There could also be (3) optional sidechaining audio input channels for audio rendering (which would be expensive; it would always need a `memcpy()` from audio inputs or outputs to the GUI inputs, handled by the DSP).
Protocol-wise, an in-plugin-process UI has no limitation on interaction between the UI and the rest (DSP, file access, etc.). AAP does not impose particular constraints on in-plugin-process UI (just as CLAP does not impose any).
While there are no constraints on interaction between the DSP and an in-plugin-process UI, it should be noted how typical plugins would deliver GUI inputs to the DSP and reflect DSP notification outputs back to the GUI. This applies to both in-process GUI and cross-process GUI. If you do not follow the practice, your audio processing could become "inconsistent" (it may drop some messages, lock the audio thread, etc.).
A DSP should have only one event input sequence (there is a good explanation of why on the CLAP website). Since a typical DAW sequencer sends events via the playback engine, the processor will have to merge the sequencer inputs and the GUI inputs. To make this work, the DSP will have to hold an input queue for the GUI where GUI inputs are stored with timestamps (usually the timestamps would not matter much, as audio processing happens over a very short duration, like 10-20 msec), and the processor will have to "unify" them in timestamped order. The resulting queue is then copied to the DSP event input sequence.
Since no locking should happen in any of those queues, the insertion from the GUI inputs into the GUI input queue has to be atomic. Copying DSP output events to the GUI does not itself have to be atomic, but since we have to avoid the situation where the same buffer gets overwritten by the next audio cycle, it should be atomically copied to the GUI output processing queue within the audio processing cycle.
These queues could be part of the framework, but how to perform these tasks optimally is up to each app, so they are not part of the framework implementation. We would offer some reference implementation, though.
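To make the idea concrete, here is a minimal sketch of such a GUI input queue with lock-free insertion and a per-cycle merge. The names (`GuiEvent`, `GuiInputQueue`, `mergeEventInputs`) are illustrative assumptions, not part of AAP, and a real-time implementation would additionally avoid the allocations shown here on the audio thread.

```kotlin
import java.util.concurrent.atomic.AtomicInteger

// Hypothetical event record: a timestamped UMP chunk coming from the GUI thread.
class GuiEvent(val timestampNanos: Long, val ump: ByteArray)

// Single-producer / single-consumer ring buffer: the GUI thread offers, the audio
// thread drains once per processing cycle. No locks, only atomic index updates.
class GuiInputQueue(capacity: Int = 256) {
    private val buffer = arrayOfNulls<GuiEvent>(capacity)
    private val writeIndex = AtomicInteger(0)
    private val readIndex = AtomicInteger(0)

    // Called from the GUI (non-audio) thread.
    fun offer(event: GuiEvent): Boolean {
        val w = writeIndex.get()
        if (w - readIndex.get() >= buffer.size) return false // full: drop instead of blocking
        buffer[w % buffer.size] = event
        writeIndex.set(w + 1) // single producer: publishing the new index is enough
        return true
    }

    // Called from the audio thread at the start of each processing cycle.
    fun drainTo(sink: MutableList<GuiEvent>) {
        var r = readIndex.get()
        val w = writeIndex.get()
        while (r < w) {
            sink.add(buffer[r % buffer.size]!!)
            r++
        }
        readIndex.set(r)
    }
}

// In the processor: merge sequencer events and drained GUI events in timestamp order,
// then copy the result into the single DSP event input sequence.
fun mergeEventInputs(sequencer: List<GuiEvent>, gui: List<GuiEvent>): List<GuiEvent> =
    (sequencer + gui).sortedBy { it.timestampNanos }
```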
The Web UI zip archive must be provided as `content://${applicationId}.aap_zip_provider/org.androidaudioplugin.ui.web/web-ui.zip` (fixed so far). In the future, the web content should be served by each plugin service, per client request with a `pluginId`.
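As a rough illustration of the current (fixed URI) scheme, a host could fetch the archive through a `ContentResolver` like the sketch below; error handling and caching are omitted, and the helper name is hypothetical.

```kotlin
import android.content.Context
import android.net.Uri
import java.io.File

// Copies the plugin's Web UI archive from the fixed content URI into the host's cache.
// `applicationId` is the plugin package's applicationId.
fun fetchWebUiZip(context: Context, applicationId: String): File {
    val uri = Uri.parse(
        "content://$applicationId.aap_zip_provider/org.androidaudioplugin.ui.web/web-ui.zip")
    val outFile = File(context.cacheDir, "$applicationId-web-ui.zip")
    context.contentResolver.openInputStream(uri)!!.use { input ->
        outFile.outputStream().use { output -> input.copyTo(output) }
    }
    return outFile
}
```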
Since this is communication between the host and the plugin, the API will have to be stable. But it only affects GUI instantiation and interoperability, so it is somewhat less important than `AudioPluginService` AIDL compatibility.
The `AAPInterop` object is registered as a global object. Its members are as follows so far (types are in Kotlin):

- Logging
  - `log(s: String)`: dispatches the log string to the Web UI host.
- View controllers
  - `onInitialize()`: called when the Web UI is initialized by the host
  - `onCleanup()`: called when the Web UI is being cleaned up
  - `onShow()`: called when the Web UI is shown
  - `onHide()`: called when the Web UI is being hidden
- DSP controllers
  - `writeMidi(data: ByteArray)`: tells the Web UI host to write the MIDI message to the plugin instance
  - `setParameter(parameterId: Int, value: Double)`: tells the Web UI host to set a parameter value
  - `write(port: Int, data: ByteArray, offset: Int, length: Int)`: tells the Web UI host to write the buffer content to the port
- Plugin information retrieval
  - `getPortCount()`: returns the number of ports
  - `getPort(index: Int): JsPortInformation`: returns the port information
  - `getParameterCount()`: returns the number of parameters
  - `getParameter(index: Int): JsParameterInformation`: returns the parameter information
(There should be more members, especially for retrieving port buffer content.)
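As a sketch of how a Web UI host might expose this object to the page, the standard Android WebView JavaScript bridge could be used. Only the member names below come from the list above; the class name, logging tag, and forwarding targets are illustrative assumptions.

```kotlin
import android.annotation.SuppressLint
import android.util.Log
import android.webkit.JavascriptInterface
import android.webkit.WebView

// Host-side object exposed to the Web UI as the global `AAPInterop`.
class AapInteropBridge {
    @JavascriptInterface
    fun log(s: String) { Log.d("AAPWebUI", s) }

    @JavascriptInterface
    fun setParameter(parameterId: Int, value: Double) {
        // forward to the plugin instance through the UI-dedicated UMP input (not shown)
    }

    @JavascriptInterface
    fun getParameterCount(): Int = 0 // would query the plugin instance

    // Note: the WebView JS bridge only passes primitives and Strings, so object-returning
    // members like getPort()/getParameter() would need a JSON (or similar) representation.
}

@SuppressLint("SetJavaScriptEnabled")
fun attachInterop(webView: WebView) {
    webView.settings.javaScriptEnabled = true
    webView.addJavascriptInterface(AapInteropBridge(), "AAPInterop")
}
```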
`JsPortInformation` has the following members:

- `getIndex(): Int`: returns the port index
- `getName(): String`: returns the port name
- `getContent(): Int`: returns the content type (General = 0 / Audio = 1 / MIDI2 = 3)
- `getDirection(): Int`: returns the port direction (Input = 0 / Output = 1)
`JsParameterInformation` has the following members:

- `getId(): Int`: returns the parameter ID
- `getName(): String`: returns the parameter name
- `getMinValue(): Float`: returns the minimum value
- `getMaxValue(): Float`: returns the maximum value
- `getDefaultValue(): Float`: returns the default value
An in-plugin-process View is useful where an overlay window is feasible.
Every in-plugin-process View must be derived from `AudioPluginView` so that it can handle interoperability with the plugin host. The actual `AudioPluginView` is returned by `AudioPluginViewFactory.createView(pluginId: String)`. Each plugin declares a GUI factory, which must be derived from this `AudioPluginViewFactory` class; `createView()` is an abstract method. The factory class is declared in `aap_metadata.xml`.
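A minimal plugin-side factory could look like the sketch below. It assumes the `AudioPluginViewFactory` / `AudioPluginView` API outlined above; the `context` property, the assumption that `AudioPluginView` is a `ViewGroup`, and the helper names are illustrative, not the actual framework API (framework imports omitted).

```kotlin
// Hypothetical plugin-side GUI factory, declared in aap_metadata.xml.
class MyPluginViewFactory : AudioPluginViewFactory() {
    override fun createView(pluginId: String): AudioPluginView {
        val view = AudioPluginView(context) // assumed: the factory exposes a Context
        view.addView(buildControls(pluginId)) // assumed: AudioPluginView acts as a ViewGroup
        return view
    }

    private fun buildControls(pluginId: String): android.view.View =
        TODO("build knobs, sliders, keyboard, etc. for $pluginId")
}
```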
The host will create a hosting View that attaches the plugin's `AudioPluginView`, and that hosting View is then attached to the `WindowManager`. This `AudioPluginView` class has these methods:
- Logging
  - `log(s: String)`: dispatches the log string to the UI host.
- View controllers
  - `onInitialize()`: invoked whenever it is attached to the `WindowManager`.
  - `onCleanup()`: invoked whenever it is detached from the `WindowManager`.
  - `onShow()`: invoked whenever the hosting overlay window is shown.
  - `onHide()`: invoked whenever the hosting overlay window is being hidden.
Unlike the Web UI protocol, we don't need DSP controllers here, as that is basically a matter of the plugin application itself (there is no interaction between the host and the plugin process).
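For illustration, attaching the hosting View as an overlay window could look like the sketch below. The window type, flags, and the handling of the `SYSTEM_ALERT_WINDOW` permission are assumptions, not part of this spec.

```kotlin
import android.content.Context
import android.graphics.PixelFormat
import android.view.View
import android.view.WindowManager

// Attaches the hosting View (which wraps the plugin's AudioPluginView) as an overlay.
fun showOverlay(context: Context, hostingView: View, width: Int, height: Int) {
    val wm = context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    val params = WindowManager.LayoutParams(
        width, height,
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY, // requires SYSTEM_ALERT_WINDOW
        WindowManager.LayoutParams.FLAG_NOT_TOUCH_MODAL,
        PixelFormat.TRANSLUCENT
    )
    wm.addView(hostingView, params)
    // wm.updateViewLayout(hostingView, params) to resize; wm.removeView(hostingView) to hide.
}
```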
The host will instantiate the plugin's View via the GUI extension `aap_gui_extension_t`, because it is the only known channel between the host and the client so far. It contains the function members below:

- `aap_gui_instance_id create(AndroidAudioPluginExtensionTarget target, const char* pluginId, int32_t pluginInstanceId, void* audioPluginView)`: returns > 0 for a new GUI instance ID, or < 0 for an error code (e.g. already instantiated, or no GUI found). The actual instantiation is asynchronous.
- `int32_t show(AndroidAudioPluginExtensionTarget target, aap_gui_instance_id guiInstanceId)`: shows the view (by using `WindowManager.addView()`)
- `void hide(AndroidAudioPluginExtensionTarget target, aap_gui_instance_id guiInstanceId)`: hides the view (by `WindowManager.removeView()`)
- `void resize(AndroidAudioPluginExtensionTarget target, aap_gui_instance_id guiInstanceId, int32_t width, int32_t height)`: resizes the view (by using `WindowManager.updateViewLayout()`); `MATCH_PARENT` and `WRAP_CONTENT` could be used as well
- `int32_t destroy(AndroidAudioPluginExtensionTarget target, aap_gui_instance_id guiInstanceId)`: destroys the GUI instance
These extension functions are, however, not necessarily implemented by the plugin itself. The plugin, or rather the GUI AAPXS, behaves quite differently:
(1) When the `AudioPluginService` instance receives a `create()` request, it does not go into the native extension implementation (similar to `get_mapping_policy()` in the `midi` extension). It looks up the `AudioPluginGuiFactory`, instantiates it, and creates the `AudioPluginView` as explained above.
(2) When the `AudioPluginView` is instantiated, it calls into JNI code that invokes the plugin's native `create()` function. That function may populate the content plugin `View` to add to the `audioPluginView` jobject. This step can be omitted, in which case `createView()` is supposed to do all the work.
Since this involves an Android View, control will come back to the JavaVM side through JNI code in most cases. But the UI requests will have to go through the native GUI extension in any case, to keep a unified entry point.
A typical GUI extension `create()` implementation would instantiate the View via some JNI call.
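For example, the Kotlin-side helper that such a native `create()` implementation reaches through JNI might look like the sketch below. The object name, function name, and callback shape are illustrative assumptions; only `AudioPluginViewFactory` / `AudioPluginView` come from the text above (their imports are omitted).

```kotlin
import android.os.Handler
import android.os.Looper

// Hypothetical bridge invoked from native create() via JNI.
object AudioPluginGuiBridge {
    @JvmStatic
    fun createViewFromNative(
        factory: AudioPluginViewFactory,
        pluginId: String,
        onCreated: (AudioPluginView) -> Unit
    ) {
        // View construction must happen on the main (UI) thread, so the native caller
        // receives the result asynchronously, matching the asynchronous create() contract.
        Handler(Looper.getMainLooper()).post {
            onCreated(factory.createView(pluginId))
        }
    }
}
```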