A rough draft for HMD direct mode on Wayland, v2
Goal: facilitate "Direct Mode" the best we can for HMD video output on Wayland
Direct Mode essentially means that the desktop is not extended to the HMD, and
that content is delivered to the HMD screen with the lowest possible latency.
Specify an HMD-shell extension, somewhat similar to fullscreen-shell but
capable of co-existing with any other shell.
HMD-shell will cover only outputs that are HMDs; HMDs should not be exposed as
generic wl_outputs. The extension should:
- enumerate HMDs (hotplug, unplug)
- provide HMD information (type, make, model, serial, etc. available via EDID)
- attach a wl_surface to a HMD, giving wl_surface the HMD role:
- buffers must be directly scanout-able; otherwise a non-fatal error event
  is raised
- sub-surfaces forbidden?
- buffer transform and scale?
- clarifications on frame callback specification
- enter/leave wl_output events not used
- Presentation extension fully functional except for sync output
identification (the sync output is always the HMD)
- status event: active, inactive (for multiplexing several VR apps in
the display server)
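As a rough sketch only, the requests and events above might be expressed in
Wayland protocol XML. Every interface and member name below (zhmd_shell_v1,
attach_surface, and so on) is a hypothetical placeholder, not an agreed
protocol:

```xml
<protocol name="hmd_shell_unstable_v1">
  <interface name="zhmd_shell_v1" version="1">
    <!-- sent once per connected HMD at bind time, and again on hotplug -->
    <event name="hmd">
      <arg name="id" type="new_id" interface="zhmd_v1"/>
    </event>
  </interface>

  <interface name="zhmd_v1" version="1">
    <!-- static identification, taken from EDID -->
    <event name="info">
      <arg name="make" type="string"/>
      <arg name="model" type="string"/>
      <arg name="serial" type="string"/>
    </event>
    <!-- the HMD was unplugged -->
    <event name="gone"/>
    <!-- give the wl_surface the HMD role -->
    <request name="attach_surface">
      <arg name="surface" type="object" interface="wl_surface"/>
    </request>
    <!-- active/inactive, for multiplexing several VR apps -->
    <event name="status">
      <arg name="state" type="uint"/>
    </event>
  </interface>
</protocol>
```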
Application binds to the HMD-shell and receives a list of connected HMDs.
Application binds to a specific HMD (or multiple). HMD screen information is
delivered to the application. The application opens the sensor devices itself
and listens to them.
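The HMD screen information mentioned above (make, model, serial) is available
from the display's EDID. A minimal sketch of extracting it from a 128-byte
EDID base block, assuming the compositor or kernel hands the raw bytes to the
client; `parse_edid_identity` is a hypothetical helper, not part of any
protocol:

```python
import struct

def parse_edid_identity(edid: bytes):
    """Extract make, model and serial from an EDID base block."""
    if len(edid) < 16 or edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not an EDID base block")
    # Manufacturer ID: three 5-bit letters packed big-endian in bytes 8-9,
    # where 1 maps to 'A'.
    mfg = struct.unpack(">H", edid[8:10])[0]
    make = "".join(chr(((mfg >> shift) & 0x1F) + ord("A") - 1)
                   for shift in (10, 5, 0))
    # Product code (bytes 10-11) and serial (bytes 12-15) are little-endian.
    model = struct.unpack("<H", edid[10:12])[0]
    serial = struct.unpack("<I", edid[12:16])[0]
    return make, model, serial

# Synthetic example: manufacturer "HVR" (H=8, V=22, R=18 -> 0x22D2).
edid = (b"\x00\xff\xff\xff\xff\xff\xff\x00"
        + struct.pack(">H", 0x22D2)
        + struct.pack("<H", 834)
        + struct.pack("<I", 1234))
print(parse_edid_identity(edid))
```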
Or, if the HMD needs to be turned on via USB before it appears as a display:
Application binds the HMD-shell. Application opens all the sensor devices etc.
and tells the HMD to turn on. HMD-shell delivers a hotplug event and the
application starts using the new HMD output.
The application creates a wl_surface and attaches it to the HMD object. The
application allocates buffers and draws them according to the HMD screen
information, and follows the standard wl_surface operation sequence with
attach, commit, etc. The application can use the Presentation extension to get
accurate timing feedback.
Because the HMD-shell uses the standard wl_surface for content delivery, it
will work out of the box with EGL implementations and also Vulkan.
The display server is expected to decouple each HMD from any other outputs.
Buffers from the current active client shall be pageflipped to the HMD screen
ASAP, as there are no other clients or surfaces to be handled. Frame callbacks
are emitted as soon as the pageflip has been programmed. Presentation feedback
provides pageflip completion timestamps.
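With those completion timestamps and the refresh period reported by the
Presentation extension, the application can predict the next vblank and time
its rendering accordingly. A minimal sketch of that arithmetic, assuming
nanosecond timestamps; the helper function is hypothetical:

```python
def next_presentation_time(last_present_ns: int,
                           refresh_ns: int,
                           now_ns: int) -> int:
    """Predict the earliest vblank strictly after now_ns.

    last_present_ns: 'presented' timestamp from the last presentation
    feedback event; refresh_ns: the reported output refresh period.
    """
    if now_ns <= last_present_ns:
        return last_present_ns + refresh_ns
    # Round up to the next whole refresh period after the last flip.
    periods = (now_ns - last_present_ns) // refresh_ns + 1
    return last_present_ns + periods * refresh_ns

# A 90 Hz HMD has a refresh period of roughly 11.111 ms.
print(next_presentation_time(1_000_000_000, 11_111_111, 1_016_000_000))
```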
- Is it necessary to associate sensors in the display server, or can that be
done in the application?
* A: The display server should only provide some "name" for the HMD, and not
care at all about any sensor devices. Sensor devices will be handled
completely in the app-side toolkit.
- Should composited output be allowed as a fallback, with a warning event
  telling the application that it is not optimal?
- Should VR controllers be associated with the HMD already in the Wayland
display server, or can it be done by the application itself?
* A: Not in the display server.