@FremyCompany
Last active August 29, 2015 14:17
"Hooking into InputDevice's Raw Input" (proposal)

At the time of writing, the input device landscape is fragmented, very fragmented. Beyond a few common concepts inherited from the very first input devices (mice and keyboards), there is no way to get the raw input that devices collect and make something useful out of it.

Inspirations

The aim of this proposal is to draw philosophically from the Pointer Events specification and to unify a large range of input devices by providing a new set of fundamental concepts beyond the ones inherited from the mouse pointer concept itself.

As such, the proposal draws a lot from the Gamepad API, which aims to provide a device interface that works well across a range of gamepads while still allowing for some customizability.

It also draws from the Pointer Lock API, in the sense that it aims to make it easier to disable the default "interpretation" of raw input into "simpler" concepts and to let the application take advantage of the raw input directly.

Concepts

The basic idea of my proposal is that an Input Device can be divided into a set of individual sensors which I call parts. The main idea is that exposing the internals of a device (via device reflection) is the first step toward allowing apps to make use of each input device's specificities.

Let's take some examples:

  • A keyboard can be divided into combinable keys (Ctrl, Alt, Shift, and InputKeys)
  • A typical mouse into a 2D slider (the mousemove deltas), a 1D slider (the mouse wheel), and two buttons (left and right).
  • A typical trackpad will feature a touch surface and maybe a single click button under the whole surface.

However, some devices which are still widely recognized as mice feature many more buttons, but those often have to be mapped to keyboard keys by custom OEM drivers to be recognized by most programs. This is a problem I would like to tackle.
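
To make the "device reflection" idea concrete, here is a minimal TypeScript sketch of what enumerating parts could look like. The inputDevices entry point and the *Like interfaces are assumptions made purely for illustration; the proposal does not define how devices are discovered.

// Minimal sketch, assuming a hypothetical `inputDevices` enumeration entry point.
interface InputDevicePartLike {
  name: string;        // e.g. "MouseLeft", "MouseWheel", "Trackpad"
  isEmulated: boolean; // true for parts synthesized from another part
}
interface InputDeviceLike {
  parts: InputDevicePartLike[];
}
declare const inputDevices: InputDeviceLike[]; // hypothetical, for illustration only

// List every physical (non-emulated) part of every connected device.
for (const device of inputDevices) {
  for (const part of device.parts) {
    if (!part.isEmulated) {
      console.log(`physical part: ${part.name}`);
    }
  }
}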

Proposed input device parts

Alright, now that we know a single device can be divided into parts, we have to decide how many different kinds of parts we want to expose. I currently divide those parts into three categories: Buttons, Sliders and Surfaces.

NOTE:

The interested reader will find attached to this email a set of (interesting!) input devices (that I've owned or used) divided into parts based on my proposal; if you don't understand everything I explain in the following sections, it's a great way to get a better picture of my proposal!

BUTTON:

A button is basically a 0D device. It's either pressed or not, possibly with a pressure value associated with the press.

To accommodate the possible button types, a "buttonId" exists on the button, mapping to either its mouse button index, its keyboard key code or a gamepad button id. The scope of the button id can be disambiguated by device type.

A button can be mapped to two kinds of DOM Events by default:

  • MouseButton events (attached to a persistent pointer like the mouse pointer)
  • KeyboardKey events (attached to the device's unified virtual keyboard)

It's always possible to enable/disable/remap the DOM behavior attached to a button.
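
As an illustration, here is a hedged sketch of remapping a button's default DOM behavior, using the fields proposed later in this document (firesMouseButtonEvents, firesKeyboardEvents, ondown); the findButton helper and the "MouseButton4" name are purely hypothetical.

// Sketch only: findButton and the button name are hypothetical.
interface ButtonPartLike {
  buttonId: number;
  firesMouseButtonEvents: boolean; // @Writable in the proposal below
  firesKeyboardEvents: boolean;    // @Writable in the proposal below
  ondown: ((event: unknown) => void) | null;
}
declare function findButton(name: string): ButtonPartLike; // hypothetical lookup helper

// Take over an extra mouse button: disable its default DOM mapping
// and handle its raw presses directly.
const extraButton = findButton("MouseButton4");
extraButton.firesMouseButtonEvents = false;
extraButton.firesKeyboardEvents = false;
extraButton.ondown = () => console.log("extra button pressed (no DOM event fired)");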

SLIDER:

A slider is any device which can output a 1D/2D/3D value stream, like a mouse wheel or a mouse move tracker (2D or 3D), but also a gamepad stick or a volume slider in a DJ hardware set.

In this case I draw a distinction between integral (position) and non-integral (speed) devices. An example of the former would be a volume slider; an example of the latter would be a mouse wheel. A short sketch after the lists below illustrates the difference.

A slider can be mapped to two kinds of DOM Events by default:

  • MouseMove events (attached to a persistent pointer like the mouse pointer)
  • MouseWheel events (attached to a persistent pointer like the mouse pointer)

But I think it would be interesting to support other events like:

  • ZoomWheel events (attached to a persistent pointer like the mouse pointer)
  • RotationWheel events (attached to a persistent pointer like the mouse pointer)
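
Here is the sketch mentioned above: it handles both an integral (position) slider and a delta-only slider, using the slider fields proposed further below. The volumeSlider and mouseWheel parts are hypothetical examples.

// Sketch only: the two slider parts are hypothetical examples.
interface SliderPartLike {
  dimension: number;          // 1, 2 or 3 axes
  isDeltaOnlyDevice: boolean; // true for wheels/mouse moves (value resets to 0 after each event)
  x: number;                  // first axis value
  onupdated: ((event: unknown) => void) | null;
}
declare const volumeSlider: SliderPartLike; // hypothetical integral (position) slider
declare const mouseWheel: SliderPartLike;   // hypothetical delta-only (speed) slider

volumeSlider.onupdated = () => {
  // Integral slider: x is an absolute position within [minX, maxX].
  console.log(`volume is now ${volumeSlider.x}`);
};

let scrolled = 0;
mouseWheel.onupdated = () => {
  // Delta-only slider: x is a delta that is reset to 0 between changes.
  scrolled += mouseWheel.x;
  console.log(`scrolled by ${mouseWheel.x}, total ${scrolled}`);
};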

SURFACE:

A surface is any device which can output one or more bitmap streams related to a surface (physical or virtual). A multitouch touchpad, a webcam or a depth camera would fit in this category.

Usually, a surface device has (or at least may have) a strategy to interpret its bitmap streams to provide a set of contacts and contact properties. Conceptually the contact API is similar to the Touch Events API in that it provides all contacts at once.

However, unlike the Touch Events API, this isn't a DOM API and the "contacts" are not mapped to the document itself. For instance, a multitouch trackpad could report contacts, but those contacts are not even mapped to the screen. A contact-handling sketch follows the list below.

A surface can be mapped to two kinds of DOM Events:

  • Touch Events (if the device is mapped to the screen, by default or as defined by the application)
  • Pointer Events (if the contacts are considered pointers and should behave as such)
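
Here is the contact-handling sketch mentioned above: it consumes raw contacts from a surface part (a trackpad, say) in device coordinates, based on the ContactCompatibleInputDevice shape proposed further down. The trackpad variable is hypothetical.

// Sketch only: the trackpad part is a hypothetical example.
interface ContactLike { x: number; y: number; pressure: number; }
interface SurfacePartLike {
  surfaceWidth: number;
  surfaceHeight: number;
  lastContacts: ContactLike[];  // cached contacts from the last event
  firesPointerEvents: boolean;  // @Writable in the proposal below
  oncontactsupdated: ((event: unknown) => void) | null;
}
declare const trackpad: SurfacePartLike; // hypothetical surface part

// Keep the default pointer mapping off and read normalized contact positions.
trackpad.firesPointerEvents = false;
trackpad.oncontactsupdated = () => {
  for (const contact of trackpad.lastContacts) {
    const nx = contact.x / trackpad.surfaceWidth;  // 0..1 across the pad
    const ny = contact.y / trackpad.surfaceHeight; // 0..1 down the pad
    console.log(`contact at (${nx.toFixed(2)}, ${ny.toFixed(2)}), pressure ${contact.pressure}`);
  }
};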

AND MORE:

In the future, the API may be extended to accommodate other device sensors like 3D Skeleton Trackers and Point Cloud devices, but this proposal doesn't attempt to cover those, mainly because I lack experience with such devices, and also because I don't think they are sufficiently standardized yet to allow a good generic sensor description.

EMULATED DEVICES

Buttons, sliders and surfaces cohabit in real-life devices but, in addition, some devices exhibit virtual buttons and sliders as the result of an emulation process for compatibility purposes. A touchpad, for instance, will usually emulate a set of sliders (mouse move, mouse wheel, zoom, rotation, ...) and buttons (left click, right click) based on its surface-based multitouch input.

The idea of my proposal would be to expose such emulated device parts, to allow apps which do not care about the internals of the input device to handle the general features only. Some complex devices may even provide multiple cascaded levels of emulation, and individual apps may hook into the precision level that is right for them, stopping the emulation at that level.

The idea, indeed, would be that it should be possible to override this default emulation behavior at the input device level, much like you can disconnect a mouse from the mouse pointer with Pointer Lock.
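
A sketch of what "stopping the emulation at a given level" could look like, using the isEmulated and fires* flags proposed below; the part list and the optional fields are illustrative only.

// Sketch only: trackpadParts is a hypothetical list of one trackpad's parts.
interface EmulatablePartLike {
  name: string;
  isEmulated: boolean;
  firesMouseMoveEvents?: boolean;  // @Writable on emulated mouse-move sliders
  firesMouseWheelEvents?: boolean; // @Writable on emulated mouse-wheel sliders
}
declare const trackpadParts: EmulatablePartLike[];

// Silence the emulated mouse-move / mouse-wheel sliders so the app can
// consume the raw "Touch Surface" part directly (see the SURFACE sketch above).
for (const part of trackpadParts) {
  if (!part.isEmulated) continue;
  if (part.firesMouseMoveEvents !== undefined) part.firesMouseMoveEvents = false;
  if (part.firesMouseWheelEvents !== undefined) part.firesMouseWheelEvents = false;
}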

THREAD MODEL

On the meta-API side of things, the basic idea of my proposal would be to make sure InputDevice events are fired on a thread other than the main thread, to decouple input analysis from the actual computations made by the app.

As a result, access to input devices can be gated by user consent (like the Pointer Lock/Gamepad APIs), though such an InputDeviceAccess permission would encompass multiple previously separate permissions.

Access to the raw device data may be restricted to one application at a time, and apps may need to support some kind of "ondisconnected" event.
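
A sketch of the threading and consent ideas: raw-input handling runs inside a Web Worker (off the main thread), behind a single consent-gated request. requestInputDeviceAccess and the shape of its result are hypothetical names, not defined by this proposal.

// Sketch only: the access-request function and its result shape are hypothetical.
interface InputDeviceAccessLike {
  devices: unknown[];
  ondisconnected: ((deviceIndex: number) => void) | null;
}
declare function requestInputDeviceAccess(): Promise<InputDeviceAccessLike>; // may prompt the user

// worker.ts -- runs off the main thread so raw-input analysis never blocks
// rendering or the application's own computations.
async function startRawInput(): Promise<void> {
  const access = await requestInputDeviceAccess();
  access.ondisconnected = (deviceIndex) => {
    console.log(`device ${deviceIndex} disconnected or claimed by another application`);
  };
  console.log(`raw access granted to ${access.devices.length} device(s)`);
}
startRawInput();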

/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// An input device may be divided in multiple parts; a mouse has buttons and a 2D slider -- a trackpad may have a touch surface and buttons
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
class InputDevice {
buttons: Array<InputDeviceButton> // all the buttons of the device
sliders: Array<InputDeviceSlider> // all the sliders of the device
surfaces: Array<InputDeviceSurface> // all the surfaces of the device
parts: Array<InputDevicePart> // all the parts listed above combined (and more if more types are added in later revisions of the spec)
isEmulated: boolean // returns true if the device as a whole is a muppet of another device (like a keyboard emulated by an on-screen visual keyboard)
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// Each part of an input device is attached to its parent device, and may emulate or be emulated by other parts
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
class InputDevicePart {
// the device in which the part is defined
parentDevice: InputDevice
// returns true if the device is auto-generated from another device (like a mouse is emulated by a trackpad)
isEmulated: boolean
emulationController: InputDevicePart // if known, the device part which controls this device as a muppet
// the devices which are emulated from this one
muppetDevices: Array<InputDevicePart>
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// devices mappable to multitouch devices
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
interface ContactCompatibleInputDevice {
//--------------------------------
// device properties
//--------------------------------
canFireContactEvents: boolean // some devices cannot generate contacts, like a Kinect 1 or a webcam
contactsSupportPressure: boolean // some properties about the contacts which are going to be generated by this device
contactsSupportAngle: boolean // (see PointerEvents for more info and properties)
contactsSupportHover: boolean // (...)
// etc...
surfaceWidth: double // the width of the surface, as reported in the contact events
surfaceHeight: double // the height of the surface, as reported in the contact events
mappedScreenSurface: Rect<double> // if the device is mapped to the screen, which region it is mapped to (otherwise: null)
//--------------------------------
// device events
//--------------------------------
isContactEnabledByDefault: boolean
@Writable isContactEnabled: boolean // some devices which can generate contact may be switched to bitmap only for performance reasons
oncontactsadded: Function<ContactCompatibleInputDeviceEvent> // similar to "touchstart"
oncontactsremoved: Function<ContactCompatibleInputDeviceEvent> // similar to "touchend"
oncontactsupdated: Function<ContactCompatibleInputDeviceEvent> // similar to "touchmove"
//--------------------------------
// device state
//--------------------------------
lastContacts: Array<InputDeviceContact> // values as cached from the last fired InputDeviceSurfaceEvent
//-------------------------------
// dom events generation
//-------------------------------
firesTouchEventsByDefault: boolean
firesPointerEventsByDefault: boolean
@Writable firesTouchEvents: boolean // even if the device can or does fire contact events, some apps may not be interested in mapping those to touch events
@Writable firesPointerEvents: boolean // even if the device can or does fire contact events, some apps may not be interested in mapping those to pointer events
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// devices mappable to a mouse wheel
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
interface MouseWheelCompatibleInputDevice {
//--------------------------------
// device properties (stable)
//--------------------------------
canFireMouseWheelEvents: boolean // probably true anyway (we can use any slider as a wheel by considering its value as a delta (or rotation speed) each time, can't we?)
//-------------------------------------
// device properties (stable)
//-------------------------------------
dimension: int // equal to 1 for (x) sliders like mouse wheels, 2 for (x,y) sliders like mice, 3 for (x,y,z) sliders like 3d mice
//-------------------------------------
// device state (transient)
//-------------------------------------
x: double // the first value of the slider/wheel // mouse wheels usually have only one dimension
y: double // the second value of the slider/wheel, if any (else 0) // mouse-move wheels have two dimensions
z: double // the third value of the slider, if any (else 0) // 3d-mouse-move wheels have three dimensions
//-------------------------------------
// device properties (continued)
//-------------------------------------
minX: double // the minimal value for the slider x-value, or -Infty
maxX: double // the maximal value for the slider x-value, or +Infty
stepX: double // the minimal increment for the x-value (or 0 if no guarantee)
// ditto for y and z
//--------------------------------
// device events
//--------------------------------
onupdated: Function<MouseWheelCompatibleInputDeviceEvent>
//-------------------------------
// dom events generation
//-------------------------------
firesMouseWheelEventsByDefault: boolean
@Writable firesMouseWheelEvents: boolean
// true if the device is used to generate wheel events (the default value can usually be overwritten)
// NOTE: a device which is not reset between changes will work like the emulated thumb wheel generated using the mouse position on wheel click
// NOTE: some devices may switch modes for real (like a mouse after PointerLock)
@Writable mouseEventsPointer: PersistentPointer // see later (the pointer for which the wheel events will be generated)
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// devices mapped to a mouse-move wheel
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
interface MouseMoveCompatibleInputDevice extends MouseWheelCompatibleInputDevice {
//--------------------------------
// device properties (stable)
//--------------------------------
canFireMouseMoveEvents: boolean // probably true anyway (we can use any slider as a wheel by considering its value as a delta (or rotation speed) each time, can't we?)
//-------------------------------------
// device properties (stable)
//-------------------------------------
xDevicePixelRatioByDefault: double // if x is expressed relative to the device, by which factor vs css pixels (normal zoom window)
@Writable xDevicePixelRatio: double // if x is expressed relative to the device, by which factor vs css pixels (normal zoom window)
xScreenZoneMin: double // relative to the screen region, which part of the screen is mapped to the device: usually minX (touch screens), or NaN (trackpads/mice) as the pointer can sense moves outside the acceptable area
xScreenZoneMax: double // relative to the screen region, which part of the screen is mapped to the device: usually maxX (touch screens), or NaN (trackpads/mice) as the pointer can sense moves outside the acceptable area
// ditto for y and z
//-------------------------------------
// dom events generation
//-------------------------------------
firesMouseMoveEventsByDefault: boolean // NOTE: mutually exclusive with firesMouseWheelEventsByDefault
@Writable firesMouseMoveEvents: boolean // NOTE: mutually exclusive with firesMouseWheelEvents
@Writable mouseEventsPointer: PersistentPointer // see below (the pointer for which the move events will be generated)
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// For a laptop/desktop, this represents a pointer which can be controlled by multiple devices simultaneously (aka the mouse pointer on Windows)
// I envision cases where your app may want to map some devices to another virtual pointer than the 'normal' mouse (for instance, an in-app virtual trackpad emulating a pen)
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
class PersistentPointer {
id: int32 // the PointerEvent pointerId associated to events for this virtual pointer
type: string // (usually 'mouse')
screenX: double // the current x position of the pointer relative to the viewport
screenY: double // the current y position of the pointer relative to the viewport
//...
}
interface KeyboardKeyCompatibleInputDevice {
//--------------------------------
// device properties (stable)
//--------------------------------
canFireKeyboardEvents: boolean
//---------------------------------
// device state (transient)
//---------------------------------
isDown: boolean // is currently down or not? (pressure > 0)
pressure: double // from 0.0 to 1.0 where 0.5 is normal
buttonId: int // a value that represents the state/type of the button, if any
buttonIdName: string // a friendly value for the state of the button, if any
// QUESTION: (if all input keys *may* map to one InputDeviceButton (as only one can be reported at the same time by the keyboard); then do those properties change when the button becomes down?)
//--------------------------------
// device events
//--------------------------------
ondown: Function<KeyboardKeyCompatibleInputDeviceEvent>
onup: Function<KeyboardKeyCompatibleInputDeviceEvent>
//-------------------------------
// dom events generation
//-------------------------------
@Writable firesKeyboardEvents: boolean
}
interface MouseButtonCompatibleInputDevice {
//--------------------------------
// device properties (stable)
//--------------------------------
canFireMouseButtonEvents: boolean
isDown: boolean // is currently down or not? (pressure > 0)
pressure: double // from 0.0 to 1.0 where 0.5 is normal
//--------------------------------
// device events
//--------------------------------
ondown: Function<MouseButtonCompatibleInputDeviceEvent>
onup: Function<MouseButtonCompatibleInputDeviceEvent>
//-------------------------------
// dom events generation
//-------------------------------
@Writable firesMouseButtonEvents: boolean
@Writable mouseEventsPointer: PersistentPointer
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// An input device which is mostly materialized as a 0D device (though it may support pressure)
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
class InputDeviceButton extends InputDevicePart {
name: string // for a mouse "MouseLeft", "MouseMiddle", "MouseRight", ...; for a keyboard: "Ctrl", "Alt", "Shift" or "InputKey" (or ...)
// a button is supposed to work like a keyboard key for most purposes
@implements KeyboardKeyCompatibleInputDevice;
@implements MouseButtonCompatibleInputDevice;
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// An input device which is mostly materialized as a vector in 1D, 2D or 3D space (like a slider, a mouse, or a 3d mouse)
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
class InputDeviceSlider extends InputDevicePart {
name: string // "MouseWheel" (or "2Axis-Pad" or "VolumeSlider" or ...)
//--------------------------------------
// device events
//--------------------------------------
onupdated: Function<InputDeviceSliderEvent> // each time the x,y,z values are changed for the device
isDeltaOnlyDevice: boolean // true if the device will be reset to 0 after each change event (like mouse-move sliders, mouse wheels...)
// NOTE: in this case "onupdated" events will always fire by sequence of events (some value, maybe some updated value, then 0 on reset)
// and any event coming after another update (which is not 0) will actually 'replace' the previously sent value and not add to it
// This can be a real wheel device or could emulate one easily:
@implements MouseWheelCompatibleInputDevice;
// This can be a mouse-move wheel device or could emulate one easily once provided a pixelRatio
@implements MouseMoveCompatibleInputDevice
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// An input device which is best represented as one or more 2d maps (like a multitouch touchpad or a touchscreen)
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
class InputDeviceSurface extends InputDevicePart {
name: string // "Trackpad", "Touchscreen", "Webcam" or ...
canGenerateBitmapStreams: boolean // can the device output raw bitmaps?
availableBitmapStreams: Array<InputDeviceBitmapSource> // the raw data the surface device can output
// this could be mapped to a pointing device (touchpad, touchscreen, kinect v2, ...)
@implements ContactCompatibleInputDevice;
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// This is like a pointer. It's just exactly what you would expect in a PointerEvent. It's just not a pointer (yet)
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
class InputDeviceContact {
// see PointerEvent, really, except we talk about device coordinates here and not yet clientX etc. (because trackpads are not mapped to the screen, for instance)
x, y, pressure, hover, angle, ...
}
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
// Some devices (touchpads, touch mouses, kinects, etc...) can generate a stream of bitmaps which can be used as raw input
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
class InputDeviceBitmapSource {
name: string // the name of the source, like "Pressure" or "Depth" or ...
onupdated: Function<InputDeviceBitmapSourceEvent>
// for devices that map to the screen like touchscreens or trackballs
isMappedToTheScreen: boolean // true if the following values are known:
xDevicePixelRatio: double // if the bitmap is mapped to the screen, by which factor vs css pixels (normal zoom settings)
yDevicePixelRatio: double // if the bitmap is mapped to the screen, by which factor vs css pixels (normal zoom settings)
mappedScreenZone: Rect // relative to the screen zone, which region of the screen is mapped to the device
}
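
As a quick illustration of how the emulation-related fields above compose, here is a sketch that walks from an emulated part back to the physical sensor driving it; the traversal helper and the example part are not part of the proposal.

// Sketch only: the field names follow the definitions above, the helper is illustrative.
interface PartLike {
  name: string;
  isEmulated: boolean;
  emulationController: PartLike | null; // the part that drives this one as a muppet, if known
}
declare const emulatedMouseMove: PartLike; // e.g. the emulated 2-axis slider of a trackpad

// Follow the emulation chain until we reach a non-emulated (physical) part.
function physicalSource(part: PartLike): PartLike {
  let current = part;
  while (current.isEmulated && current.emulationController) {
    current = current.emulationController;
  }
  return current;
}
console.log(`driven by: ${physicalSource(emulatedMouseMove).name}`);
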
Generic Mouse:
'
A generic mouse can be described pretty straightforwardly using this API.
'
(Real device parts):
"Mouse Left Button":
buttonId: MOUSE_LEFT
canFireKeyboardEvents: false
canFireMouseEvents: true
"Mouse Right Button":
buttonId: MOUSE_RIGHT
canFireKeyboardEvents: false
canFireMouseEvents: true
"Mouse Middle Button":
buttonId: MOUSE_MIDDLE
canFireKeyboardEvents: false
canFireMouseEvents: true
"Mouse move 2-axis slider":
dimension: 2
firesWheelEventsByDefault: false
firesPointerMoveEventsByDefault: true
pointerId: DEFAULT_POINTER
minX: -Infty
minY: -Infty
maxX: +Infty
maxY: +Infty
stepX: 1.0
stepY: 1.0
xDevicePixelRatio: 1
yDevicePixelRatio: 1
xScreenMin: NaN
yScreenMin: NaN
xScreenMax: NaN
yScreenMax: NaN
"Mouse wheel 1-axis slider":
dimension: 1
firesWheelEventsByDefault: true
firesPointerMoveEventsByDefault: false
pointerId: DEFAULT_POINTER
minX: -60
minY: -60
maxX: +60
maxY: +60
stepX: 1
stepY: 1
xDevicePixelRatio: 1
yDevicePixelRatio: 1
xScreenMin: NaN
yScreenMin: NaN
xScreenMax: NaN
yScreenMax: NaN
Microsoft Touch Mouse:
'
The Microsoft Touch Mouse is physically described as a touch surface on the top of a mouse.
Like a touchpad, it only features one physical button and the finger pressure determines whether the user left- or right-clicked.
Like a touchpad, you can perform gestures on it, but the ones you may perform are different, as panning is mapped to scrolling.
You can view a video description of the device here:
https://www.youtube.com/watch?v=L0ZbnCfv_XI
You can view a more technical description of the device here:
http://www.hanselman.com/blog/AbusingTheMicrosoftResearchsTouchMouseSensorAPISDKWithAConsolebasedHeatmap.aspx
'
(Real device parts):
"Touch Surface":
surfaceWidth: 15
surfaceHeight: 13
mappedScreenSurface: null
availableBitmaps:
"Pressure Map"
isContactEnabled: true
firesContactEventsByDefault: false
contactsSupportPressure: true
contactsSupportAngle: false
contactsSupportHover: false
firesPointerEventsByDefault: false
firesTouchEventsByDefault: false
"Mutltiusage Mouse Button":
buttonId: (neither MOUSE_LEFT nor MOUSE_MIDDLE nor MOUSE_RIGHT)
canFireKeyboardEvents: false
canFireMouseEvents: false
(Emulated device parts):
"Mouse Left Button":
buttonId: MOUSE_LEFT
canFireKeyboardEvents: false
canFireMouseEvents: true
"Mouse Right Button":
buttonId: MOUSE_RIGHT
canFireKeyboardEvents: false
canFireMouseEvents: true
"Mouse move 2-axis slider":
dimension: 2
firesWheelEventsByDefault: false
firesPointerMoveEventsByDefault: true
pointerId: DEFAULT_POINTER
minX: -Infty
minY: -Infty
maxX: +Infty
maxY: +Infty
stepX: 0.5
stepY: 0.5
xDevicePixelRatio: 1
yDevicePixelRatio: 1
xScreenMin: NaN
yScreenMin: NaN
xScreenMax: NaN
yScreenMax: NaN
"Mouse wheel 2-axis Slider":
dimension: 2
firesWheelEventsByDefault: true
firesPointerMoveEventsByDefault: false
pointerId: DEFAULT_POINTER
minX: -60
minY: -60
maxX: +60
maxY: +60
stepX: 1
stepY: 1
xDevicePixelRatio: 1
yDevicePixelRatio: 1
xScreenMin: NaN
yScreenMin: NaN
xScreenMax: NaN
yScreenMax: NaN
"Flick-left gesture Button":
buttonId: (SOME_VALUE)
canFireKeyboardEvents: false
canFireMouseEvents: false
"Flick-right gesture Button":
buttonId: (SOME_VALUE)
canFireKeyboardEvents: false
canFireMouseEvents: false
"Two-Finger Flick-up gesture Button":
...
...
MacBookPro Touchpad:
'
A fairly hidden feature of the MBP Touchpad is its ability to function as a full touch interface.
Not only does the touchpad work fairly well at recognizing gestures, but it can be "pointer-locked" into a raw touch surface.
In fact, I copy-pasted most of my touch-mouse description here, with only subtle details like the default behavior of devices being changed :-)
A funny skateboard game running on a MBP illustrating this feature:
http://notebooks.com/2011/08/10/touchgrind-brings-multitouch-gaming-to-the-mac/
Some code to try it out on your mac:
http://www.steike.com/code/multitouch/
'
(Real-devices):
"Touch Surface":
surfaceWidth: 1.0
surfaceHeight: 1.0
mappedScreenSurface: null
availableBitmaps:
"Pressure Map"
isContactEnabled: true
firesContactEventsByDefault: false
contactsSupportPressure: true
contactsSupportAngle: true
contactsSupportHover: false
firesPointerEventsByDefault: false // the emulated devices do
firesTouchEventsByDefault: false
"Mutltiusage Mouse Button":
buttonId: (neither MOUSE_LEFT nor MOUSE_MIDDLE nor MOUSE_RIGHT)
canFireKeyboardEvents: false
canFireMouseEvents: false
(Emulated device parts):
"Mouse Left Button":
buttonId: MOUSE_LEFT
canFireKeyboardEvents: false
canFireMouseEvents: true
"Mouse Right Button":
buttonId: MOUSE_RIGHT
canFireKeyboardEvents: false
canFireMouseEvents: true
"Mouse move 2-axis slider":
dimension: 2
firesWheelEventsByDefault: false
firesPointerMoveEventsByDefault: true
pointerId: DEFAULT_POINTER
minX: -Infty
minY: -Infty
maxX: +Infty
maxY: +Infty
stepX: 0.5
stepY: 0.5
xDevicePixelRatio: 1
yDevicePixelRatio: 1
xScreenMin: NaN
yScreenMin: NaN
xScreenMax: NaN
yScreenMax: NaN
"Mouse wheel 2-axis Slider":
dimension: 2
firesWheelEventsByDefault: true
firesPointerMoveEventsByDefault: false
pointerId: DEFAULT_POINTER
minX: -60
minY: -60
maxX: +60
maxY: +60
stepX: 1
stepY: 1
xDevicePixelRatio: 1
yDevicePixelRatio: 1
xScreenMin: NaN
yScreenMin: NaN
xScreenMax: NaN
yScreenMax: NaN
"Flick-left gesture Button":
buttonId: (SOME_VALUE)
canFireKeyboardEvents: false
canFireMouseEvents: false
"Flick-right gesture Button":
buttonId: (SOME_VALUE)
canFireKeyboardEvents: false
canFireMouseEvents: false
"Two-Finger Flick-up gesture Button":
...
...
Wii-based spotlight input device:
'
A camera tuned to a specific light frequency, plus a light emitter
that you can use to "click" (bright flash) or "hover" (constant but lower-intensity lighting),
by mapping a portion of the camera's field of view to the screen.
'
(Real devices):
"Frequence-tuned camera":
surfaceWidth: 640
surfaceHeight: 480
mappedScreenSurface: (primary screen region)
availableBitmaps:
"Webcamp-input"
isContactEnabled: true
firesContactEventsByDefault: true
contactsSupportPressure: false
contactsSupportAngle: false
contactsSupportHover: true
firesPointerEventsByDefault: true
firesTouchEventsByDefault: true

RByers commented Apr 9, 2015

Note that Android MotionEvent and X11 XInput API also have some of these properties. Android MotionEvent is, in some sense, even more abstract in that things like 'x', 'y', 'major radius', 'minor radius', 'pressure' are all instances of an 'axis' (which is an enum). Then you can replace a ton of APIs like 'supportsPressure' with 'supportsAxis(AXIS_PRESSURE)'.
