OrbTouch

This paper presents a machine learning approach to map outputs from an embedded array of sensors distributed throughout a deformable body to continuous and discrete virtual states, and its application to interpreting human touch in soft interfaces. We integrate stretchable capacitors into a rubber membrane, and use a passive addressing scheme to probe the sensor array in real time. To process the signals from this array, we feed capacitor measurements into convolutional neural networks that classify and localize touch events on the interface. We implement this concept with a device called OrbTouch. To modularize the system, we use a supervised learning approach wherein a user defines a set of touch inputs and trains the interface by giving it examples; we demonstrate this by using OrbTouch to play the popular game Tetris. Our regression model localizes touches with a mean test error of 0.09 mm, while our classifier recognizes gestures with a mean test accuracy of 98.8%. In a separate demonstration, we show that OrbTouch can discriminate between different users with a mean test accuracy of 97.6%. At test time, we feed the outputs of these models into a debouncing algorithm to provide a nearly error-free experience.

Credits: Larson et al. 2018
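As a rough illustration of the pipeline the abstract describes, the sketch below pairs a small convolutional network with two heads (one classifying the gesture, one regressing the touch coordinates, mirroring the "discrete and continuous virtual states") with a toy debouncer that suppresses spurious predictions at test time. The grid size, layer widths, gesture count, and the unanimity rule in `debounce` are illustrative assumptions, not the architecture or algorithm used in the paper.

```python
import torch
import torch.nn as nn
from collections import deque, Counter


class TouchNet(nn.Module):
    """Small CNN over a grid of capacitance readings that jointly predicts
    a gesture class (discrete state) and a 2-D touch location (continuous
    state). Layer sizes and the 8x8 grid are assumptions for illustration."""

    def __init__(self, grid_size=8, n_gestures=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Flatten(),
        )
        feat_dim = 32 * grid_size * grid_size
        self.classifier = nn.Linear(feat_dim, n_gestures)  # gesture logits
        self.regressor = nn.Linear(feat_dim, 2)            # (x, y) location

    def forward(self, x):
        h = self.features(x)
        return self.classifier(h), self.regressor(h)


def debounce(class_stream, window=5):
    """Toy debouncer: emit a gesture label only when the last `window`
    per-frame predictions all agree, suppressing one-off misclassifications."""
    buf = deque(maxlen=window)
    for c in class_stream:
        buf.append(c)
        if len(buf) == window:
            label, count = Counter(buf).most_common(1)[0]
            if count == window:
                yield label


if __name__ == "__main__":
    net = TouchNet()
    frame = torch.randn(1, 1, 8, 8)      # one synthetic capacitance frame
    logits, xy = net(frame)
    print(logits.shape, xy.shape)        # torch.Size([1, 5]) torch.Size([1, 2])
    print(list(debounce([0, 2, 2, 2, 2, 2, 1])))  # -> [2, 2]
```

The two output heads are trained separately in spirit (classification loss for gestures, regression loss for location); the debouncer only post-processes the classifier's frame-by-frame outputs before they are mapped to game controls.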

References

Larson, Chris; Spjut, Josef; Knepper, Ross; Shepherd, Robert. "A Deformable Interface for Human Touch Recognition using Stretchable Carbon Nanotube Dielectric Elastomer Sensors and Deep Neural Networks." arXiv:1706.02542.

Project Page

Chris Larson, Personal website
