@ppaalanen
Created February 22, 2017 11:09
A rough idea for Wayland HMD shell
Goal: facilitate "Direct Mode" as well as we can for HMD video output on
Wayland. Direct Mode essentially means that the desktop is not extended to the
HMD, and that content reaches the HMD screen with the lowest possible latency.
Specify an HMD-shell extension, somewhat similar to fullscreen-shell but
capable of co-existing with any other shell.
HMD-shell will cover only outputs that are HMDs. HMDs should not be exposed as
wl_outputs.
HMD-shell (a rough client API sketch follows this list):
- enumerate HMDs (hotplug, unplug)
- provide HMD information (type, make, model, serial, etc., as available via
  EDID)
- associated sensors:
  - sensor type: evdev device (gyro, IMU, buttons, etc.), v4l2 device
    (cameras), ...
  - device: device node path for the client to open
- attach a wl_surface to an HMD, giving the wl_surface the HMD role:
  - buffers must be directly scanout-able, with a non-fatal error event
    otherwise
  - sub-surfaces forbidden?
  - buffer transform and scale?
  - clarifications on frame callback specification
  - enter/leave wl_output events not used
  - Presentation extension fully functional except for sync output
    identification (the sync output is always the HMD)
  - status event: active, inactive (for multiplexing several VR apps in
    the display server)
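
As a sketch only, generated client-side bindings for such an extension might
look roughly like the C declarations below. Every name here (hmd_shell,
hmd_output, hmd_surface and their events) is invented for illustration; no
such protocol exists yet.

/* Hypothetical client API sketch for an hmd-shell extension; all interface,
 * event and request names are invented for illustration. */
#include <stdint.h>

struct wl_surface;
struct hmd_shell;	/* global advertised by the compositor */
struct hmd_output;	/* one connected HMD */
struct hmd_surface;	/* a wl_surface carrying the HMD role */

struct hmd_shell_listener {
	/* HMD hotplug: a new hmd_output object is announced */
	void (*hmd_added)(void *data, struct hmd_shell *shell,
			  struct hmd_output *hmd);
	/* HMD unplug */
	void (*hmd_removed)(void *data, struct hmd_shell *shell,
			    struct hmd_output *hmd);
};

struct hmd_output_listener {
	/* identification as available via EDID */
	void (*info)(void *data, struct hmd_output *hmd, const char *make,
		     const char *model, const char *serial);
	/* an associated sensor: type (evdev, v4l2, ...) and the device node
	 * path the client is expected to open itself */
	void (*sensor)(void *data, struct hmd_output *hmd, uint32_t type,
		       const char *device_node);
	/* active/inactive, for multiplexing several VR apps */
	void (*status)(void *data, struct hmd_output *hmd, uint32_t active);
};

/* request giving a wl_surface the HMD role on this HMD */
struct hmd_surface *
hmd_output_get_hmd_surface(struct hmd_output *hmd, struct wl_surface *surface);
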
Operation:
The application binds to the HMD-shell and receives a list of connected HMDs.
The application then binds to a specific HMD (or several). HMD screen
information and the associated sensor devices are delivered to the
application. The application opens the sensor device nodes itself and listens
to them.
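
For example, if a sensor is exposed as an evdev node, the client could open
and read it with the plain Linux input API; this is only a sketch of that
step and is independent of the shell protocol itself.

#include <fcntl.h>
#include <unistd.h>
#include <linux/input.h>

static int
open_hmd_sensor(const char *device_node)
{
	/* device_node is the path delivered by the compositor, e.g. an
	 * /dev/input/event* node for the HMD's IMU */
	return open(device_node, O_RDONLY | O_NONBLOCK | O_CLOEXEC);
}

static void
read_sensor_events(int fd)
{
	struct input_event ev;

	while (read(fd, &ev, sizeof ev) == sizeof ev) {
		/* feed EV_ABS / EV_REL samples into head tracking */
	}
}
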
The application creates a wl_surface and attaches it to the HMD object. The
application allocates buffers, renders into them according to the HMD screen
information, and follows the standard wl_surface operation sequence of
attach, commit, etc. The application can use the Presentation extension to
get accurate timing feedback.
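
Put together, the client flow might look like the sketch below. The
wl_compositor and wl_surface calls are real libwayland-client API; hmd_shell
and related names are the hypothetical ones from the earlier sketch, and
buffer allocation, sensor handling and error handling are omitted.

#include <string.h>
#include <wayland-client.h>

extern const struct wl_interface hmd_shell_interface;	/* hypothetical */

static struct wl_compositor *compositor;
static struct hmd_shell *hmd_shell;

static void
registry_global(void *data, struct wl_registry *registry, uint32_t name,
		const char *interface, uint32_t version)
{
	if (strcmp(interface, "wl_compositor") == 0)
		compositor = wl_registry_bind(registry, name,
					      &wl_compositor_interface, 1);
	else if (strcmp(interface, "hmd_shell") == 0)	/* hypothetical global */
		hmd_shell = wl_registry_bind(registry, name,
					     &hmd_shell_interface, 1);
}

static void
registry_global_remove(void *data, struct wl_registry *registry, uint32_t name)
{
}

static const struct wl_registry_listener registry_listener = {
	registry_global,
	registry_global_remove,
};

int
main(void)
{
	struct wl_display *display = wl_display_connect(NULL);
	struct wl_registry *registry = wl_display_get_registry(display);

	wl_registry_add_listener(registry, &registry_listener, NULL);
	wl_display_roundtrip(display);	/* collect globals */

	/* here the client would add an hmd_shell listener, roundtrip again
	 * to receive hmd_added events, and pick one HMD to use */

	struct wl_surface *surface = wl_compositor_create_surface(compositor);
	/* hmd_output_get_hmd_surface(hmd, surface); -- give it the HMD role */

	/* per frame: attach a scanout-capable buffer, damage, request a
	 * frame callback, and commit:
	 *   wl_surface_attach(surface, buffer, 0, 0);
	 *   wl_surface_damage(surface, 0, 0, width, height);
	 *   cb = wl_surface_frame(surface);
	 *   wl_surface_commit(surface);
	 */
	wl_surface_commit(surface);

	while (wl_display_dispatch(display) != -1)
		;	/* frame callbacks and presentation feedback arrive here */

	return 0;
}
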
Because the HMD-shell uses the standard wl_surface for content delivery, it
will work out of the box with EGL implementations and also Vulkan.
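
For instance, the usual Wayland EGL window setup would apply unchanged; the
only assumption in this sketch is that 'surface' is the wl_surface that was
given the HMD role above.

#include <wayland-egl.h>
#include <EGL/egl.h>

/* Standard Wayland EGL path; nothing HMD-specific is needed because the HMD
 * role is carried by an ordinary wl_surface. */
static EGLSurface
create_hmd_egl_surface(EGLDisplay dpy, EGLConfig config,
		       struct wl_surface *surface, int width, int height)
{
	struct wl_egl_window *native =
		wl_egl_window_create(surface, width, height);

	/* eglSwapBuffers() on this surface attaches and commits buffers on
	 * the underlying wl_surface as usual */
	return eglCreateWindowSurface(dpy, config,
				      (EGLNativeWindowType)native, NULL);
}
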
The display server is expected to decouple each HMD from all other outputs.
Buffers from the currently active client shall be pageflipped to the HMD
screen as soon as possible, since there are no other clients or surfaces to
be handled. Frame callbacks are emitted as soon as the pageflip has been
programmed. Presentation feedback provides pageflip completion timestamps.
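
On the client side, reading those timestamps with the existing
presentation-time protocol could look like the sketch below; the listener
signatures are those of the real protocol, and the header name is whatever
wayland-scanner generates for presentation-time.xml in a given build.

#include <stdint.h>
#include <stdio.h>
#include "presentation-time-client-protocol.h"

static void
feedback_sync_output(void *data, struct wp_presentation_feedback *feedback,
		     struct wl_output *output)
{
	/* per the proposal, sync output identification is not used:
	 * the sync output is always the HMD */
}

static void
feedback_presented(void *data, struct wp_presentation_feedback *feedback,
		   uint32_t tv_sec_hi, uint32_t tv_sec_lo, uint32_t tv_nsec,
		   uint32_t refresh, uint32_t seq_hi, uint32_t seq_lo,
		   uint32_t flags)
{
	uint64_t sec = ((uint64_t)tv_sec_hi << 32) | tv_sec_lo;

	/* pageflip completion time of the committed buffer */
	printf("presented at %llu.%09u, refresh %u ns, zero-copy: %s\n",
	       (unsigned long long)sec, tv_nsec, refresh,
	       (flags & WP_PRESENTATION_FEEDBACK_KIND_ZERO_COPY) ?
	       "yes" : "no");

	wp_presentation_feedback_destroy(feedback);
}

static void
feedback_discarded(void *data, struct wp_presentation_feedback *feedback)
{
	wp_presentation_feedback_destroy(feedback);
}

static const struct wp_presentation_feedback_listener feedback_listener = {
	feedback_sync_output,
	feedback_presented,
	feedback_discarded,
};

/* before each wl_surface_commit():
 *   struct wp_presentation_feedback *fb =
 *           wp_presentation_feedback(presentation, surface);
 *   wp_presentation_feedback_add_listener(fb, &feedback_listener, NULL);
 */
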
Questions:
- Is it necessary to associate sensors with HMDs in the display server, or
  can that be done in the application?
- Should composited output be allowed as a fallback, with a warning event
  telling the application that its presentation path is not optimal?
- Should VR controllers be associated with the HMD already in the Wayland
  display server, or can it be done by the application itself?