How to capture Steam VR content and use it with Pupil Capture.

These instructions will guide you through using the Pupil VR Add-on with third-party content in Steam VR, such as a game for which you have no access to the camera feed or its properties.

1. Record the VR content using an external tool:

To capture the scene camera feed without having to develop anything, there are third-party plugins for OBS, like this one or this, that allow you to capture your VR content through the OpenVR API.

Please note that there is no OpenXR solution yet.

2. Recording with the eye tracker using Pupil Core Software:

Record the data with Pupil Capture without calibrating. At the beginning of the recording, ask the user/participant to look at salient objects in the game content, such as edges, signs, etc., which will later be used as reference points for calibration.

It is helpful to annotate events such as the beginning and end of each look at a reference point. These annotations will help you define the temporal offset in step 3.2 and select the points in step 4.
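Annotations can be set with the Annotation plugin's hotkeys in the Pupil Capture GUI, or sent programmatically over the Pupil Network API. Below is a minimal sketch of the latter, based on the pupil-helpers remote annotation example; the port, labels, and timing are illustrative assumptions, and the Annotation plugin must be enabled in Capture for the annotations to end up in the recording.

```python
import time

import msgpack
import zmq

# Connect to Pupil Remote (Pupil Capture's Network API, default port 50020).
ctx = zmq.Context()
pupil_remote = ctx.socket(zmq.REQ)
pupil_remote.connect("tcp://127.0.0.1:50020")

# Query the PUB port and the current Pupil time so the annotations use the same clock.
pupil_remote.send_string("PUB_PORT")
pub_port = pupil_remote.recv_string()
pupil_remote.send_string("t")
pupil_time = float(pupil_remote.recv_string())

pub_socket = ctx.socket(zmq.PUB)
pub_socket.connect(f"tcp://127.0.0.1:{pub_port}")
time.sleep(1.0)  # give the PUB socket a moment to connect before publishing


def send_annotation(label, timestamp, duration=0.0):
    """Publish an annotation so Pupil Capture stores it in the recording."""
    payload = {
        "topic": "annotation",
        "label": label,
        "timestamp": timestamp,
        "duration": duration,
    }
    pub_socket.send_string(payload["topic"], flags=zmq.SNDMORE)
    pub_socket.send(msgpack.dumps(payload, use_bin_type=True))


# Hypothetical label: mark when the participant starts looking at a target.
send_annotation("look_target_1_start", pupil_time)
```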

3. Loading the video onto Pupil Player:

  1. First, make the recorded video compatible with Pupil Player; for that, use this code. (A rough sketch of what such a conversion involves follows after this list.)

  2. Then use the following plugin in Pupil Player to load the video and correct for any temporal offset.

Check how to install custom plugins here. Once installed, navigate to the plugin menu and check the 'Temporal offset' field. You will need to manually adjust the value to compensate for the difference between the start times of the Pupil Capture and OBS recordings.
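For orientation: in a typical Pupil recording the scene video is stored as world.mp4 with one timestamp per frame in world_timestamps.npy, so the conversion in step 3.1 presumably produces something along these lines. The linked script is the authoritative version; the sketch below only illustrates the idea, and the paths, start time, and constant-frame-rate assumption are placeholders.

```python
# Sketch only: the linked script is the authoritative conversion.
import shutil

import cv2
import numpy as np

obs_video = "obs_capture.mp4"        # path to the OBS recording (placeholder)
recording_dir = "path/to/recording"  # Pupil Capture recording folder (placeholder)
start_time = 1234.567                # Pupil time at which OBS started recording
                                     # (e.g. measured via an annotation from step 2)

# Count frames and read the frame rate from the OBS video.
cap = cv2.VideoCapture(obs_video)
fps = cap.get(cv2.CAP_PROP_FPS)
n_frames = int(cap.get(cv2.CAP_PROP_FRAME_COUNT))
cap.release()

# One timestamp per frame on the Pupil clock, assuming a constant frame rate.
timestamps = start_time + np.arange(n_frames) / fps

# Place the video and its timestamps where Pupil Player expects the scene video.
shutil.copy(obs_video, f"{recording_dir}/world.mp4")
np.save(f"{recording_dir}/world_timestamps.npy", timestamps)
```

With timestamps like these in place, the plugin's 'Temporal offset' is just a constant shift applied to the video timestamps so that the OBS frames line up with the Pupil clock.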

4. Perform a gaze calibration post-hoc:

Load the recording into Pupil Player as usual.

  1. Once the video is loaded properly in Pupil Player, you will need to perform a post-hoc calibration.

  2. To do so, you need to open the Gaze Data menu and change the data source to Post-Hoc Gaze Calibration.

  3. Click on reference locations and toggle on the manual edit mode, then mark on the video the points the participant looked at in step 2.

  4. With all the reference points selected, click on new calibration, change the gaze mapping from 3D to 2D, and then click on calculate.

  5. Create a new Gaze Mapper using the calibration from step 4.
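To sanity-check the result, you can run Pupil Player's raw data export and inspect the mapped gaze. A minimal sketch, assuming the default export layout (exports/000/gaze_positions.csv; column names may differ slightly between Pupil software versions):

```python
# Quick check on the exported gaze after post-hoc calibration.
import pandas as pd

gaze = pd.read_csv("path/to/recording/exports/000/gaze_positions.csv")

# Keep reasonably confident samples; normalized positions should mostly fall
# inside [0, 1] if the 2D mapping is sensible.
good = gaze[gaze["confidence"] > 0.6]
print(f"{len(good)}/{len(gaze)} samples above confidence 0.6")
print(good[["norm_pos_x", "norm_pos_y"]].describe())
```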
