Tobii EyeTracking SDK for Unity 5 - Manual

SDK version: 2.0 Beta (Compatibility: Unity 5.4, 5.3, and 5.2)

Switch to Scripting Reference

Good Choice

Thanks for choosing Tobii! You have invested your money and curiosity in an eye tracker device; let me now walk you through the steps to set you up to develop a whole new kind of game interaction using the player's eye-gaze.

Contents

  • System Requirements
  • Getting Started - Explore the Provided Demo Scenes
  • Integrating Tobii EyeTracking With Your Game
  • Tobii EyeTracking Framework for Unity
  • Troubleshooting
  • Appendix A - Unity Editor How-tos

System Requirements

Version 2.0 of the Tobii EyeTracking SDK for Unity requires the following setup on your computer:

  • Unity 5.4, 5.3 or 5.2
  • Windows 10, 8.1, or 7
  • Microsoft VC Redistributable 2012, 32 and 64 bit (can be downloaded from Microsoft, and is included in Visual Studio 2012)
  • Tobii Engine runtime (included in the install bundle for the eye tracker, or pre-installed on integrated systems)
  • Tobii consumer eye tracker peripheral or built-in device (check out integrated solutions from Acer, Alienware, and MSI, or the peripheral from Steelseries on their respective webpages, or Tobii's own peripheral (http://www.tobii.com/xperience/products/))

The code samples in the SDK are written in C#. It is also possible to write your scripts in UnityScript, if you prefer.

For download links and more information about other Tobii SDKs, visit the Tobii Developer Zone (http://developer.tobii.com).

↑ Back to Top

Getting Started - Explore the Provided Demo Scenes

Follow these steps to get started exploring the demo scenes included in the SDK.

Step 1: Install and make sure your eye tracker is working

(For non-integrated eye trackers only) Follow the instructions included with your eye tracker to download and install the correct Tobii Engine runtime software. Make sure the eye tracker and software are working correctly.

Step 2: Download the Tobii EyeTracking SDK for Unity

If you haven't already done so, download the Tobii EyeTracking SDK for Unity from the Tobii Developer Zone (http://developer.tobii.com). You need to be logged in to access the Downloads page. You can create an account for free; it only takes a couple of minutes.

Step 3: Import the Tobii EyeTracking Assets to a New Unity Project

Save the Tobii EyeTracking SDK for Unity zip file in an easy-to-remember place on disk. Extract the SDK zip and open the folder. In the root you will find a unitypackage that includes all the Tobii EyeTracking SDK assets and demo scenes.

Open Unity, and create a New Unity Project. Select Windows as Target Platform.

Import all assets from the Tobii EyeTracking SDK unitypackage

Step 4: Get to know the API and the eye-gaze data

Locate the Tobii folder in the Unity Editor Project panel. Expand it and find the DemoScenes folder.

Add the demo scenes to the build using the custom Tools menu (included in the DemoScenes folder): Tools → Add Tobii SDK Demo Scenes to Build

Open the 01_EyeTrackingData scene and go through the three provided demo scenes to get an overview of the core features of the Tobii EyeTracking Framework for Unity.

↑ Back to Top

Integrating Tobii EyeTracking With Your Game

When you have explored the core features of the SDK, it is time to try it out in your game. If at any point things don't work as expected, check out the Troubleshooting section.

Import the Tobii EyeTracking Assets

Make sure Windows is the selected Target Platform in the build settings. Import the Tobii EyeTracking SDK for Unity unitypackage to your game (the DemoScenes folder is optional).

Implement features using the Tobii EyeTracking static API

To start using the Tobii EyeTracking API, add using Tobii.EyeTracking; to a script, type EyeTracking. and browse the static functions of the API. You might also want to explore the possibilities of the built-in object selection using eye-gaze by adding the Gaze Aware component to some of your game objects. The following sections give some hints on what to explore.

Configure Gaze Focus layers

If you want to use the SDK's built-in object selection using eye-gaze, you need to set up your environment for Gaze Focus detection.

Open the Eye Tracking Settings window from the Edit menu: Edit → Eye Tracking Settings...

Click to open the Gaze Focus Layers multi-selection dropdown and make sure that all the layers where you want Gaze Aware game objects to be focusable using eye-gaze are selected. Also make sure that layers you do not want to track with eye-gaze (like transparent layers) are not selected.

Make game objects Gaze Aware

You can make a game object Gaze Aware by adding the Gaze Aware component: Add Component → Eye Tracking → Gaze Aware.

Read the HasGazeFocus property in the Update loop of a Gaze Aware game object and have it react to the user's eye-gaze by changing its behavior or appearance. Or, let your underlying game algorithms query which game object the user is looking at using EyeTracking.GetFocusedObject(), and have the game respond to the user's eye-gaze in some other way.
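
As a minimal sketch of the first approach, assuming the GazeAware component class lives in the Tobii.EyeTracking namespace like the rest of the API, the hypothetical script below tints the object's material while it has gaze focus:

using Tobii.EyeTracking;
using UnityEngine;

// Attach to a game object that also has the Gaze Aware component and a Collider.
public class GazeHighlighter : MonoBehaviour
{
    private GazeAware _gazeAware;
    private Renderer _renderer;
    private Color _originalColor;

    void Start()
    {
        _gazeAware = GetComponent<GazeAware>();
        _renderer = GetComponent<Renderer>();
        _originalColor = _renderer.material.color;
    }

    void Update()
    {
        // Highlight the object while the user is looking at it.
        _renderer.material.color = _gazeAware.HasGazeFocus ? Color.yellow : _originalColor;
    }
}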

Note that Gaze Aware objects might not be a good fit for your particular game genre, game mechanics, or the size and number of your game objects. But don't despair, there is still a lot to explore using the eye-gaze data directly.

Use Gaze Point data

Get the latest Gaze Point by calling EyeTracking.GetGazePoint(). This data can be used to implement one or more of the following eye-gaze interaction features in your game (a code sketch follows the list):

  • Extended View (pans the camera a little when you look at the edges of the screen)
  • Infinite Screen (pans the camera continuously when you look towards the edges of the screen)
  • Clean UI (make UI elements semi-transparent if you don't look at them)
  • Aim@Gaze (aim your gun at the gaze point when you press an aim button)
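
As an illustration, here is a minimal, hedged sketch of a Clean UI-style behavior: a UI panel is faded out while the user is not looking at it. The class name, the CanvasGroup setup, and the fade parameters are assumptions made for the example; only EyeTracking.GetGazePoint() and the GazePoint members come from the SDK.

using Tobii.EyeTracking;
using UnityEngine;

// Assign a CanvasGroup on a Screen Space - Overlay canvas panel in the Inspector.
public class CleanUiPanel : MonoBehaviour
{
    public CanvasGroup canvasGroup;   // the UI panel to fade
    public float fadedAlpha = 0.3f;   // alpha when not looked at (arbitrary assumption)
    public float fadeSpeed = 4f;      // fade speed in alpha units per second (arbitrary assumption)

    void Update()
    {
        GazePoint gazePoint = EyeTracking.GetGazePoint();
        if (!gazePoint.IsValid) { return; } // invalid data is expected at startup etc.

        // For canvases not in Screen Space - Overlay mode, pass the UI camera as a third argument.
        bool lookedAt = RectTransformUtility.RectangleContainsScreenPoint(
            (RectTransform)canvasGroup.transform, gazePoint.Screen);

        float targetAlpha = lookedAt ? 1f : fadedAlpha;
        canvasGroup.alpha = Mathf.MoveTowards(canvasGroup.alpha, targetAlpha, fadeSpeed * Time.deltaTime);
    }
}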

In the Tobii Gaming Team, we have implemented and helped implement these features in a number of games (from triple-A titles to indie games), and we have iterated on and improved the algorithms over time.

For your convenience, we have put together a separate package of Tobii Interaction Samples for Unity, to be made available soon, where the features can be explored as full implementations in 3D scenes from first- and third-person perspectives. These interaction sample scenes give you an opportunity to evaluate how well the different interaction features might work in your game.

Dive into more advanced features

You can read more about the API functions and features in the section Tobii EyeTracking Framework for Unity, where you also find information on how to make use of advanced features of the API.

Build for Standalone

The Tobii EyeTracking assets include an editor script that copies the correct plugin DLLs to the build output folder. The EyeTracking features are currently only available for PC (Windows 10/8.1/7), but the framework compiles on Mac and Linux as well (without copying the DLLs).

In addition to the Tobii client DLLs included in the asset package, you need to make sure the Visual C++ Redistributable Package for Visual Studio 2012 (v110) is included in your game build, both the x86 and the x86_64 redistributables, since the Tobii client DLLs depend on them. You can download an installer from Microsoft (the linked page does not work well with Chrome, use another web browser): Visual C++ Redistributable for Visual Studio 2012

Tobii EyeTracking Framework for Unity

It might be tempting to skip this part of the documentation, but I promise you that it will save you time later to at least browse through it.

This section gives an overview of the eye tracking features available in the Tobii EyeTracking Framework for Unity. It introduces the core concepts and gives you insight into what tools and features the SDK has to offer you, and how they work.

For API details on each class and function, see the Tobii EyeTracking for Unity Scripting Reference.

GazePoint data

In the Tobii EyeTracking Framework a GazePoint is the data type representing the point on the screen where the user is looking. More technically it is the point on the screen where the eye tracker has calculated that a line along the user's eye-gaze intersects with the screen plane.

The GazePoint.Screen property returns a Vector2 coordinate in Unity screen space. This might seem much like getting a mouse pointer coordinate, but please do not make the mistake of using this point out-of-the-box as you would a mouse pointer coordinate. Eye tracking data is not as precise as mouse pointer data - it is physically impossible for it to be, due to how our eyes and vision work. Instead, think of a series of GazePoints as representing an area where the user is looking, and take into account that the accuracy and precision of the data vary from user to user.
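
A minimal sketch of this "area" way of thinking, using only the EyeTracking.GetGazePoint() call and the GazePoint.Screen and IsValid members described in this manual: average the most recent points into a center and a rough radius instead of treating each sample as an exact position. The class name and buffer size are arbitrary assumptions.

using System.Collections.Generic;
using System.Linq;
using Tobii.EyeTracking;
using UnityEngine;

public class GazeAreaEstimator : MonoBehaviour
{
    private readonly Queue<Vector2> _recentPoints = new Queue<Vector2>();
    private const int BufferSize = 15; // roughly a quarter of a second at 60 fps (arbitrary assumption)

    public Vector2 Center { get; private set; } // estimated center of the gaze area
    public float Radius { get; private set; }   // rough spread of the recent samples

    void Update()
    {
        GazePoint gazePoint = EyeTracking.GetGazePoint();
        if (!gazePoint.IsValid) { return; }

        _recentPoints.Enqueue(gazePoint.Screen);
        while (_recentPoints.Count > BufferSize) { _recentPoints.Dequeue(); }

        // Use the average as a stabilized gaze position, and the spread as a reminder
        // that the data describes an area rather than an exact pixel.
        Center = new Vector2(_recentPoints.Average(p => p.x), _recentPoints.Average(p => p.y));
        Radius = _recentPoints.Max(p => Vector2.Distance(p, Center));
    }
}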

↑ Back to Top

EyeTracking static API

The recommended starting point for accessing gaze point data from the Tobii EyeTracking Framework is to use the static functions on the static class EyeTracking. Just include the line using Tobii.EyeTracking; in a script and you will have access to the static API functions.

Here are some examples of static methods on the class:
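
(The following is a hedged sketch that uses only calls named elsewhere in this manual; the script name is hypothetical, and EyeTracking.GetFocusedObject() is assumed here to return the focused GameObject or null.)

using Tobii.EyeTracking;
using UnityEngine;

public class StaticApiExample : MonoBehaviour
{
    void Update()
    {
        // The latest gaze point; remember to check IsValid before using it.
        GazePoint gazePoint = EyeTracking.GetGazePoint();
        if (gazePoint.IsValid)
        {
            Debug.Log("Gaze point on screen: " + gazePoint.Screen);
        }

        // The game object currently considered to have gaze focus
        // (assumed to be a GameObject reference, or null when nothing is focused).
        var focusedObject = EyeTracking.GetFocusedObject();
        if (focusedObject != null)
        {
            Debug.Log("Focused object: " + focusedObject.name);
        }
    }
}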

You can read more about focused object and user presence in the sections below.

Invalid data due to lazy initialization

The EyeTracking static API is implemented using so-called 'lazy initialization'. This means that the API is not initialized until the first call to any of its functions. Because of this, the first time the functions are called they will return invalid values (IsValid is false). Depending on how many game loops it takes to initialize the underlying framework, there will be a number of frames where invalid data is returned.

The static API can be explicitly initialized using EyeTracking.Initialize(). This way, the initialization period with invalid data can be moved to, for example, the startup sequence of the game instead of to the first request for data.
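
For example, a minimal sketch of explicit initialization during startup (the script name and its placement in a startup scene are assumptions):

using Tobii.EyeTracking;
using UnityEngine;

// Place on a game object in your startup/loading scene so that the frames of
// invalid data pass before gameplay begins.
public class EyeTrackingBootstrap : MonoBehaviour
{
    void Awake()
    {
        EyeTracking.Initialize();
    }
}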

↑ Back to Top

Gaze Focus and the GazeAware component

The Tobii EyeTracking Framework has built-in support for mapping eye-gaze data to game objects. We call this feature Gaze Focus. Rather than simply mapping every gaze point to a game object, the idea is that an object that has Gaze Focus is intentionally focused by the user. To achieve this, we use algorithms under the hood that take series of gaze points, the history of focused objects, and timings in human perception into account. The 2.0 version of the SDK uses a first version of these algorithms, and the plan is to refine them continually in future versions of the framework. The goal is a fast and robust calculation that works for different eye tracker models with different accuracy and precision characteristics, and also across a broad set of end users with different eye-gaze characteristics.

The Gaze Focus system only maps game objects that are IGazeFocusable. The easiest way to make a game object gaze focusable is to add the GazeAware component to it. This component will register the game object at OnEnable and unregister it at OnDisable. In the game object's Update loop the GazeAware component's HasGazeFocus property can be read to know if the game object is focused or not.

For game algorithms outside of the individual game objects, it is also possible to ask the gaze focus handler which object is currently focused using the static API function EyeTracking.GetFocusedObject(), and have the game respond to this information. Only one object (or no object) is considered to have gaze focus at a time.

As a developer you should use Gaze Focus as it is, out-of-the-box, since any additional filtering done on top of the gaze focus calculation might become a bad fit when the algorithms change in a future release. Highlighting of objects and visualizations should use timings related to human perception rather than be adapted to a specific gaze focus algorithm, eye tracker model, or a specific user's gaze-tracking characteristics.

Gaze Focus is only available for 3D and 2D game objects. It does not work with UI elements.

↑ Back to Top

User Presence and Gaze Tracking states

The EyeTracking static API gives direct access to two states (or statuses): User Presence and Gaze Tracking.

User Presence is a state that indicates whether a user is present in front of the eye-tracked screen or not. This state can be used to, for example, pause some feature if there is no user present.

Gaze Tracking is a state that indicates whether the user's eye-gaze is currently tracked, in other words if the eye tracker is currently able to calculate the point on the screen where the user is looking.

↑ Back to Top

Advanced API features

This section gives an overview of some more advanced API features. While the EyeTracking static API uses basic data types and basic C# language constructs, some of these more advanced APIs use C# language constructs that are more familiar to experienced C# developers (like generics, SomeClass<T>, and IEnumerable instead of arrays).

Custom Gaze Aware component

If you, for some reason, do not want to or cannot use the supplied Gaze Aware component to make your game objects gaze focusable, it is also possible to create your own custom Gaze Aware component by implementing the IGazeFocusable interface. Just make sure to call the register and unregister focusable object functions on the gaze focus handler at the appropriate times. See the GazeAware class implementation for details.

↑ Back to Top

Data Provider

When you have gotten comfortable using gaze point data, you might realize that EyeTracking.GetGazePoint() is too rudimentary for some of the things you want to implement. If you need better control over the gaze point data, you can use an IDataProvider instead of the static API.

The following code example gives you an overview of how to use a data provider and what data is offered. More information on each function is available in the Tobii EyeTracking Framework for Unity Scripting Reference.

using Tobii.EyeTracking;
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

public class ExampleClass : MonoBehaviour
{
    private IDataProvider<GazePoint> _gazePointProvider;
    private ITimestamped _lastHandledPoint;

    void Awake()
    {
        // Awake runs before OnEnable, so the provider exists when Start/Stop are called on it below.
        _gazePointProvider = EyeTrackingHost.GetInstance().GetGazePointDataProvider();
    }

    void OnEnable()
    {
        _gazePointProvider.Start(gameObject.GetInstanceID());
    }

    void OnDisable()
    {
        _gazePointProvider.Stop(gameObject.GetInstanceID());
    }

    void Update()
    {
        GazePoint last = _gazePointProvider.Last; // the last data point received from the eye tracker
        GazePoint frameConsistent = _gazePointProvider.GetFrameConsistentDataPoint(); // same as EyeTracking.GetGazePoint()
        // use the last and frameConsistent data points
        // ...

        IEnumerable<GazePoint> pointsSinceLastHandled = _gazePointProvider.GetDataPointsSince(_lastHandledPoint);
        foreach (var point in pointsSinceLastHandled)
        {
            // handle each point that has arrived since the previous Update()
        }
        if (pointsSinceLastHandled.Any())
        {
            _lastHandledPoint = pointsSinceLastHandled.Last();
        }
    }
}

↑ Back to Top

System Configuration and States

The IEyeTrackingHost interface provides a number of additional features not available in the EyeTracking static API. This section highlights a couple of these features.

To access features on the IEyeTrackingHost you should always use EyeTrackingHost.GetInstance(). This method returns different implementations of IEyeTrackingHost depending on where in the application life cycle it is called, ensuring consistent behavior and that no memory is leaked during application shutdown.

  • EyeTrackingHost.GetEngineAvailability() Use this method to check whether a Tobii Engine runtime is available. This is a useful query for games that also build standalone for Linux and Mac, to block or skip Tobii EyeTracking features on those platforms.
  • IEyeTrackingHost.EyeTrackingDeviceStatus If this property returns a status of DeviceStatus.Tracking, the eye tracker is connected, up and running, and trying to track the user's eyes. Any other status means that no eye tracking data can be expected - the eye tracker could, for example, not be connected, be initializing, or be disabled by the user. Note that this status will always return an invalid value the first time it is read; a valid value will be returned within a few frames after that first read. (A short sketch follows this list.)
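
Here is a hedged sketch of how these might be used together. It assumes the engine availability value can simply be logged, and that EyeTrackingDeviceStatus can be compared directly to DeviceStatus.Tracking; the exact types are not spelled out in this manual, so treat this as an outline rather than a definitive implementation.

using Tobii.EyeTracking;
using UnityEngine;

public class EyeTrackingAvailabilityCheck : MonoBehaviour
{
    void Start()
    {
        // Is a Tobii Engine runtime available at all? Useful in builds that
        // also target Mac or Linux, where eye tracking features should be disabled.
        var engineAvailability = EyeTrackingHost.GetEngineAvailability();
        Debug.Log("Tobii Engine availability: " + engineAvailability);
    }

    void Update()
    {
        // DeviceStatus.Tracking means the eye tracker is connected and tracking.
        // Expect an invalid value for the first few frames after the first read.
        var status = EyeTrackingHost.GetInstance().EyeTrackingDeviceStatus;
        if (status == DeviceStatus.Tracking)
        {
            // Eye tracking data can be expected; enable gaze-driven features here.
        }
    }
}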

For more information on the available functions and properties of the IEyeTrackingHost see the Tobii EyeTracking Framework for Unity scripting reference.

↑ Back to Top

Troubleshooting

If things are not working as you expected, please check out the following small troubleshooting guides.

"Tobii.EyeX.Client.dll could not be loaded"

Here are some things to try. Please try them in order:

  1. Make sure your build setting is for Standalone PC (Windows x86 or x86_64)
  2. Make sure you have Microsoft's VC Redist 2012 installed on your computer. The redist is included in Visual Studio 2012, but you can also download a separate installer from Microsoft: Visual C++ Redistributable for Visual Studio 2012.
  3. Check which bitness for Windows is selected in the build settings (x86 or x86_64), then go to File Explorer in Windows and manually copy Tobii.EyeX.Client.dll from the corresponding Assets/Tobii/Plugins/x86 or ../x86_64 folder to the game project's root folder.

"System.IO.File does not contain a definition for ReadAllText"

Check your build settings and make sure Platform is set to "PC, Mac & Linux Standalone".

The Tobii EyeTracking SDK for Unity only provides gaze-data on Windows 10/8.1/7, and only builds without errors for standalone platforms (Windows, Mac and Linux).

"I get invalid data"

Getting invalid eye-gaze data is expected in the following situations:

  • during game startup,
  • at the first call for a value (for example the first call to EyeTracking.GetGazePoint()),
  • for a few frames following the first call to a value function,
  • during game shutdown,
  • on unsupported platforms (like Mac and Linux)
  • when the Tobii EyeTracking Framework is not getting any data from the eye tracker:
    • when the eye tracker is not setup for the monitor it is attached to (check the Display Setup in the Tobii taskbar popup menu)
    • when the Tobii Engine is Disabled or not running, not configured, or Initializing (you might want to stop and restart the Tobii Engine application - select Quit in the task bar context menu and then type "EyeX" in the Start menu, select the installed engine runtime application to open and restart it)
    • when you are not looking at the screen
    • when your eyes are outside the so called track-box in front of the eye tracker (try moving your head closer/further away, vertically or horizontally until the eye tracker can track your eyes)

You have to take these expected cases of invalid gaze data into account in your game implementation.
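
In practice this mostly means guarding every read with the IsValid check, along the lines of this minimal sketch (the class name is hypothetical):

using Tobii.EyeTracking;
using UnityEngine;

public class GazeDrivenFeature : MonoBehaviour
{
    void Update()
    {
        GazePoint gazePoint = EyeTracking.GetGazePoint();
        if (!gazePoint.IsValid)
        {
            // Expected at startup, on the first call, on unsupported platforms,
            // or when the eye tracker cannot see the user's eyes right now.
            return;
        }

        // Safe to use gazePoint.Screen here.
    }
}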

"My Gaze Aware objects are not reacting to eye-gaze"

First of all, the Gaze Aware component only works with 3D and 2D game objects that have a Collider. If you have attached the component to a UI element on a Canvas, it will not work. If you have attached the Gaze Aware component to a 3D or 2D game object that has a Collider but it still doesn't work, keep on reading.

The Gaze Focus algorithms only work when the Tobii EyeTracking Framework has access to valid Gaze Point data. See the previous section for situations when the Gaze Aware objects are not expected to react to eye-gaze (because there is no valid Gaze Point data available).

If none of the above situations apply, open the Eye Tracking Settings window: Edit → Eye Tracking Settings... Verify that the proper layers are selected in the Gaze Focus Layers multi-selection dropdown. Only the selected layers will be considered by the Gaze Focus algorithms that calculate which Gaze Aware game object the user is looking at.

"The gaze points are offset to the upper-left"

If the gaze points are offset towards the upper-left when running the game in the Unity Editor, and the offset increases the closer to the lower-right corner of the screen you look, then you probably have a DPI problem. On Windows 10 this offset problem is more common, since the screen DPI is automatically set to higher values for high-resolution monitors. There are two possible workarounds:

  • Set the DPI scale for your monitor manually to a value of 125% or lower, or
  • Build a standalone build and test the eye-tracking related features there

Further explanation of the DPI problem

The Unity Editor application is not DPI-aware, which means that Windows automatically scales the Unity Editor window if the screen DPI is set to a scale higher than 125%, drawing the window bigger on the screen without letting the Unity Editor know anything about it. This causes problems when we try to map a physical point in space where the user is looking to a logical pixel on the virtual screen: the values returned by Windows when converting between physical and logical pixels do not correspond to where the pixels are actually drawn, but to where they would have been drawn if the application window had been rendered at its original 100% size. (This is so the application's internal logic keeps working, and mouse click coordinates and so on correspond to where the application believes it has drawn its UI elements.) Unfortunately there is no API to ask Windows where it actually draws each pixel of the client application, so until Windows provides such an API or Unity makes the Unity Editor DPI-aware, we are stuck with the workarounds suggested above.

None of the above describes the issue I have

If you cannot find the answer to your problem here, please check out our Developer Forums (developer.tobii.com/community-forums/). Use the Forums' search function to find existing answers, or post a new topic if your question has not been asked before.

Appendix A - Unity Editor How-tos

Set Windows as Target Platform

File → Build Settings...

  • Platform: PC, Mac & Linux Standalone
  • Target Platform: Windows
  • Architecture: x86 or x86_64

Import the Tobii EyeTracking SDK unitypackage

Assets → Import Package → Custom Package...

  • Browse to the root folder of the Tobii EyeTracking SDK package you have downloaded and extracted to disk
  • Select the Tobii EyeTracking SDK unitypackage
  • Import all assets in the package

↑ Back to Top
