Affective Computing Input Kit

We think that inputs from the human body - like eye-tracking cameras and brain-computer interfaces - can change the way humans interact with machines. Input devices are arguably the hardest part of getting AR/VR displays to beat PC/mobile for certain tasks.

Our open hardware kit aims to take these inputs out of the lab and into the hands of developers. We have written a Unity plugin and our Asteroid software to make sense of these inputs, but all the hardware connects via Bluetooth and USB, so any application can use it.

Eye Tracker

We’re most proud of our custom-built, low-cost, open hardware eye tracker. You can purchase the eye tracker by itself for $200. Eye tracking is probably the most promising biosensor input method, since so much information about attention and intent can be inferred from the eyes.

Our eye tracker uses two high-resolution (1280x720), 60 fps global-shutter cameras connected to a Raspberry Pi board. It includes a custom-designed face mount that holds the cameras underneath the eyes.

Our Raspberry Pi board automatically tracks pupil positions and sends that data via Bluetooth to Asteroid, our Unity plugin, or any custom application. You can also access the eye images directly via USB and run any custom image-processing algorithm.
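
To give a sense of what a custom pipeline might look like, here is a rough Python sketch that grabs frames from one of the eye cameras over USB and finds the pupil as the darkest large blob in the image. It assumes the camera enumerates as a standard UVC device; the device index and threshold value are illustrative, not part of the kit.

    # Rough sketch: dark-pupil detection on raw USB frames.
    # Assumes the eye camera shows up as a standard UVC device;
    # the device index and threshold value are illustrative.
    import cv2

    cap = cv2.VideoCapture(0)  # first eye camera over USB
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
    cap.set(cv2.CAP_PROP_FPS, 60)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        gray = cv2.GaussianBlur(gray, (7, 7), 0)
        # The pupil is the darkest large region in an eye image.
        _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if contours:
            pupil = max(contours, key=cv2.contourArea)
            (x, y), radius = cv2.minEnclosingCircle(pupil)
            print(f"pupil at ({x:.0f}, {y:.0f}), radius {radius:.0f}")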

Gesture Sensor

To track hand motion in real time, we've included a Flick gesture sensor in the dev kit. The Flick sensor works on the principle of electric field sensing: a set of electrodes emits an electric field above the sensor's surface, and an on-board chip detects how a nearby hand displaces that field.

Both the Asteroid software and the Unity plugin support automatic recognition of some basic gestures, but the raw data can be accessed over Bluetooth as well.
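
As a sketch of what consuming that raw stream could look like, here's some Python that reads from a Bluetooth serial (SPP) port with pyserial. The port name and the six-byte packet layout are assumptions for illustration, not the kit's documented format.

    # Rough sketch: reading raw hand-position data over Bluetooth SPP.
    # The port name and 6-byte packet layout (x, y, z as uint16) are
    # assumptions for illustration, not the kit's documented format.
    import struct
    import serial

    port = serial.Serial("/dev/rfcomm0", 115200, timeout=1)

    while True:
        packet = port.read(6)
        if len(packet) < 6:
            continue
        x, y, z = struct.unpack("<HHH", packet)
        # Normalize to 0.0-1.0 across the sensing volume above the electrodes.
        print(f"hand at ({x / 65535:.2f}, {y / 65535:.2f}, {z / 65535:.2f})")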

Scrubber/9DoF

Our handheld controller combines a 9DoF sensor with a soft linear potentiometer. The controller is especially suited to manipulating the position and rotation of 3D objects in space, and to scrolling through a range of values with your finger.

This controller works out of the box with Asteroid and our Unity plugin, but the raw data can be accessed through Bluetooth as well.
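
For custom applications, the raw stream can be parsed the same way as the other sensors. Here's a rough Python sketch that unpacks a hypothetical packet of nine float32 IMU axes plus one float32 potentiometer reading; the actual layout may differ.

    # Rough sketch: parsing the controller's raw stream. The packet
    # layout (nine float32 IMU axes plus one float32 potentiometer
    # value) is an assumption for illustration only.
    import struct
    import serial

    port = serial.Serial("/dev/rfcomm1", 115200, timeout=1)
    PACKET = struct.Struct("<10f")  # ax ay az gx gy gz mx my mz pot

    while True:
        data = port.read(PACKET.size)
        if len(data) < PACKET.size:
            continue
        ax, ay, az, gx, gy, gz, mx, my, mz, pot = PACKET.unpack(data)
        # Map the potentiometer (assumed 0.0-1.0) onto a scroll range.
        scroll = pot * 100.0
        print(f"accel=({ax:.2f}, {ay:.2f}, {az:.2f})  scroll={scroll:.1f}")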

Brain-Computer Interface

A brain-computer interface (BCI) measures electrical signals through four electrodes placed on the front and back of the head. If processed correctly, these signals can give surprisingly accurate data about mood and arousal. Both our Unity plugin and the Asteroid application include a pre-trained ML model to predict emotions from these signals.

Our BCI uses an easily wearable headband whose four electrodes send electrical signals from your brain into our custom board. Basic filtering and signal processing are done on-board before the data is sent via Bluetooth. The data can then be run through a pre-trained ML model to estimate the user's emotional state. We will release the designs for the board once they are finalized.
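
To make the signal path concrete, here is a rough Python sketch of the kind of feature extraction such a model might consume: per-channel power in a few standard EEG bands, computed with SciPy. The sample rate, band edges, and model object are assumptions for illustration, not our shipped pipeline.

    # Rough sketch: band-power features from the four-channel stream,
    # the kind of input an emotion classifier might consume. The sample
    # rate and band edges are assumptions, not our shipped pipeline.
    import numpy as np
    from scipy.signal import welch

    FS = 256  # assumed samples per second
    BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

    def band_powers(window: np.ndarray) -> np.ndarray:
        """window: (4, n_samples) array of raw electrode readings."""
        freqs, psd = welch(window, fs=FS, nperseg=FS)
        feats = []
        for lo, hi in BANDS.values():
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(psd[:, mask].mean(axis=1))  # one value per channel
        return np.concatenate(feats)  # 4 channels x 3 bands = 12 features

    # features = band_powers(latest_window)
    # mood = pretrained_model.predict([features])  # hypothetical model object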

Conclusion

We put together this hardware kit internally because we wanted to test our software, but it turned out to be better than anything we could find off the shelf. The other BCIs and eye trackers we found were either very expensive and intended for research use, or part of a closed gaming ecosystem. We hope this kit will help developers start experimenting with novel HMI ideas right away.