@ppaalanen
Created March 14, 2017 14:31
A rough draft for HMD direct mode on Wayland, v2
Goal: facilitate "Direct Mode" as well as we can for HMD video output on Wayland.
Direct Mode essentially means that the desktop is not extended to the HMD, and
that content reaches the HMD screen with the lowest possible latency.
Specify an HMD-shell extension, somewhat similar to fullscreen-shell but
capable of co-existing with any other shell.
HMD-shell will cover only outputs that are HMDs. HMDs should not be exposed as
wl_outputs.
HMD-shell (a rough sketch of a possible client-side interface follows this list):
- enumerate HMDs (hotplug, unplug)
- provide HMD information (type, make, model, serial, etc., available via EDID)
- attach a wl_surface to an HMD, giving the wl_surface the HMD role:
  - buffers must be directly scanout-able; a non-fatal error event is sent
    otherwise
  - sub-surfaces forbidden?
  - buffer transform and scale?
  - clarifications on the frame callback specification
  - enter/leave wl_output events are not used
  - Presentation extension fully functional except for sync output
    identification (the sync output is always the HMD)
- status event: active, inactive (for multiplexing several VR apps in
  the display server)
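None of this protocol exists yet. Purely as an illustration, here is a
client-side C sketch of what such an interface could look like; every hmd_*
name below is invented for this draft and is not part of any real protocol.

    /* Hypothetical client-side API, as wayland-scanner might generate it
     * from an hmd-shell XML. All hmd_* names are invented for this sketch. */
    #include <stdint.h>

    struct hmd_shell;    /* global: enumerates HMDs, announces hotplug/unplug */
    struct hmd_output;   /* one physical HMD */
    struct hmd_surface;  /* a wl_surface with the HMD role, tied to one HMD */

    struct hmd_shell_listener {
        /* a new HMD has been plugged in (or existed at bind time) */
        void (*hmd)(void *data, struct hmd_shell *shell,
                    struct hmd_output *hmd);
    };

    struct hmd_output_listener {
        /* EDID-derived description: make, model, serial, ... */
        void (*info)(void *data, struct hmd_output *hmd,
                     const char *make, const char *model, const char *serial);
        /* the HMD was unplugged */
        void (*gone)(void *data, struct hmd_output *hmd);
    };

    struct hmd_surface_listener {
        /* active/inactive: whether this client currently drives the HMD */
        void (*status)(void *data, struct hmd_surface *surf, uint32_t status);
        /* non-fatal: the attached buffer could not be scanned out directly */
        void (*not_scanout)(void *data, struct hmd_surface *surf);
    };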
Operation:
Application binds to the HMD-shell and receives a list of connected HMDs.
Application binds to a specific HMD (or multiple). HMD screen information is
delivered to the application. The application opens the sensor devices itself
and listens to them.
Or, if the HMD needs to be turned on via USB before it appears as a display:
Application binds to the HMD-shell. Application opens all the sensor devices etc.
and tells the HMD to turn on. HMD-shell delivers a hotplug event and the
application starts using the new HMD output.
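As a sketch, the binding step could look like the following. The wl_display
and wl_registry calls are the standard libwayland-client API; the "hmd_shell"
interface name and its generated header are assumptions carried over from the
hypothetical sketch above.

    /* Bind the hypothetical hmd_shell global via the standard registry. */
    #include <string.h>
    #include <wayland-client.h>
    #include "hmd-shell-client-protocol.h"   /* hypothetical, does not exist */

    static struct hmd_shell *shell;

    static void registry_global(void *data, struct wl_registry *registry,
                                uint32_t name, const char *interface,
                                uint32_t version)
    {
        if (strcmp(interface, "hmd_shell") == 0)
            shell = wl_registry_bind(registry, name, &hmd_shell_interface, 1);
    }

    static void registry_global_remove(void *data, struct wl_registry *registry,
                                       uint32_t name)
    {
    }

    static const struct wl_registry_listener registry_listener = {
        registry_global,
        registry_global_remove,
    };

    int main(void)
    {
        struct wl_display *display = wl_display_connect(NULL);
        struct wl_registry *registry;

        if (!display)
            return 1;

        registry = wl_display_get_registry(display);
        wl_registry_add_listener(registry, &registry_listener, NULL);
        wl_display_roundtrip(display);  /* after this, existing HMDs are known */
        /* ... add an hmd_shell listener and wait for HMD announce events ... */
        return 0;
    }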
The application creates a wl_surface and attaches it to the HMD object. The
application allocates buffers and draws them according to the HMD screen
information, and follows the standard wl_surface operation sequence with
attach, commit, etc. The application can use the Presentation extension to get
accurate timing feedback.
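A minimal sketch of that sequence, assuming the hypothetical
hmd_shell_get_hmd_surface request from the sketch above is what assigns the
HMD role; buffer allocation (e.g. a dmabuf suitable for direct scanout) is
left out:

    /* Give a wl_surface the HMD role and run the usual attach/commit cycle.
     * The wl_surface and wl_callback calls are standard libwayland-client API;
     * hmd_shell_get_hmd_surface is a hypothetical request. */
    #include <stdint.h>
    #include <wayland-client.h>
    #include "hmd-shell-client-protocol.h"   /* hypothetical, does not exist */

    static void frame_done(void *data, struct wl_callback *cb, uint32_t time_ms)
    {
        wl_callback_destroy(cb);
        /* render the next frame with fresh sensor data, then attach/commit again */
    }

    static const struct wl_callback_listener frame_listener = { frame_done };

    static void present_frame(struct wl_compositor *compositor,
                              struct hmd_shell *shell,
                              struct hmd_output *hmd,
                              struct wl_buffer *buffer)
    {
        struct wl_surface *surface = wl_compositor_create_surface(compositor);
        struct hmd_surface *hmd_surface =
            hmd_shell_get_hmd_surface(shell, surface, hmd);   /* hypothetical */
        struct wl_callback *cb;

        (void)hmd_surface;  /* would deliver status (active/inactive) events */

        wl_surface_attach(surface, buffer, 0, 0);
        wl_surface_damage(surface, 0, 0, INT32_MAX, INT32_MAX);
        cb = wl_surface_frame(surface);
        wl_callback_add_listener(cb, &frame_listener, NULL);
        wl_surface_commit(surface);
    }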
Because the HMD-shell uses the standard wl_surface for content delivery, it
will work out of the box with EGL implementations and also Vulkan.
The display server is expected to decouple each HMD from all other outputs.
Buffers from the currently active client shall be pageflipped to the HMD screen
ASAP, as there are no other clients or surfaces to be handled. Frame callbacks
are emitted as soon as the pageflip has been programmed. Presentation feedback
provides pageflip completion timestamps.
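The Presentation extension exists today as the presentation-time protocol, so
a client could collect the pageflip completion timestamps roughly like this
(the generated header name may vary by build setup):

    /* Per-frame presentation feedback using the presentation-time protocol.
     * As the draft says, sync_output carries no useful wl_output here. */
    #include <inttypes.h>
    #include <stdio.h>
    #include <wayland-client.h>
    #include "presentation-time-client-protocol.h"

    static void feedback_sync_output(void *data,
                                     struct wp_presentation_feedback *fb,
                                     struct wl_output *output)
    {
        /* ignored: the sync output is always the HMD, not a wl_output */
    }

    static void feedback_presented(void *data,
                                   struct wp_presentation_feedback *fb,
                                   uint32_t tv_sec_hi, uint32_t tv_sec_lo,
                                   uint32_t tv_nsec, uint32_t refresh,
                                   uint32_t seq_hi, uint32_t seq_lo,
                                   uint32_t flags)
    {
        uint64_t sec = ((uint64_t)tv_sec_hi << 32) | tv_sec_lo;

        printf("pageflip completed at %" PRIu64 ".%09" PRIu32
               " s, refresh period %" PRIu32 " ns\n", sec, tv_nsec, refresh);
        wp_presentation_feedback_destroy(fb);
    }

    static void feedback_discarded(void *data,
                                   struct wp_presentation_feedback *fb)
    {
        wp_presentation_feedback_destroy(fb);
    }

    static const struct wp_presentation_feedback_listener feedback_listener = {
        feedback_sync_output,
        feedback_presented,
        feedback_discarded,
    };

    /* Call right before wl_surface_commit() for the frame to be measured. */
    static void request_feedback(struct wp_presentation *presentation,
                                 struct wl_surface *surface)
    {
        struct wp_presentation_feedback *fb =
            wp_presentation_feedback(presentation, surface);

        wp_presentation_feedback_add_listener(fb, &feedback_listener, NULL);
    }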
Questions:
- Is it necessary to associate sensors in the display server, or can that be
  done in the application?
  A: The display server should only provide some "name" for the HMD, and not
  care at all about any sensor devices. Sensor devices will be handled
  completely in the app-side toolkit.
- Should composited output be allowed as a fallback, with a warning event
  that the application is not optimal?
- Should VR controllers be associated with the HMD already in the Wayland
  display server, or can it be done by the application itself?
  A: Not in the display server.