Skinput Turns Human Body Into A Touch Screen Finger Input Interface
Imagine this – you are out on a morning jog with an MP3 player, listening to music and stretching your legs. What if you could simply tap your fingers to change the music, pause the song, or play the next item in the playlist?
Skinput makes it possible.
It works like this – menus from electronic devices (e.g. smartphones) are projected onto your skin from a body-worn unit. To operate the device, all you have to do is tap different areas of your skin to activate the desired function.
The Technology Behind Skinput And How The System Works
Skinput’s technology allows the human body to be used as an input surface. The hardware captures the acoustic impact of finger taps using an array of sensors worn as an armband.
In the following video, Chris Harrison talks about the technology behind Skinput and how bioacoustic sensing allows the human body to be used as a large finger-input surface (without requiring any electronic components on the skin).
When a finger taps the human skin, the impact creates useful acoustic signals: it gives rise to longitudinal waves that propagate throughout the body.
To capture these body-impact signals, the developers created a special-purpose bioacoustic sensing device, shown in the following image:
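Before a tap can be classified, the continuous sensor stream has to be segmented into individual impact events. The sketch below is a hypothetical illustration of that idea (the names, threshold, and refractory window are assumptions, not Skinput's actual implementation): an onset is declared whenever the signal amplitude crosses a threshold, and a short refractory period prevents one tap from being counted twice.

```python
import numpy as np

def detect_taps(samples, threshold=0.5, refractory=100):
    """Find tap onsets in a 1-D sensor signal by simple amplitude thresholding.

    samples: array of sensor readings; threshold: amplitude that counts as a tap;
    refractory: number of samples to skip after an onset so one tap is counted once.
    """
    onsets = []
    i = 0
    while i < len(samples):
        if abs(samples[i]) >= threshold:
            onsets.append(i)
            i += refractory  # skip the rest of this impact
        else:
            i += 1
    return onsets

# Simulated sensor stream with two tap impacts
signal = np.zeros(1000)
signal[200:205] = 0.9
signal[600:605] = 0.8
print(detect_taps(signal))  # → [200, 600]
```

A real system would work on windowed, filtered multi-channel data, but the thresholding-plus-refractory idea is the same.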
The software developed by Skinput’s team listens to these body impacts and classifies them. Different interactive capabilities can be bound to different locations on the body, e.g. the hand, palm, or fingers.
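The binding of tap locations to actions can be pictured as a simple classifier over acoustic features. The following sketch is purely illustrative (the centroids, feature vectors, and action names are made up, and Skinput's actual classifier is more sophisticated): each body location has a trained reference feature vector, and a new tap is assigned to the nearest one.

```python
import numpy as np

# Hypothetical reference feature vectors (e.g. per-channel impact amplitudes
# from the armband sensors) learned for taps at three body locations.
centroids = {
    "palm":    np.array([0.9, 0.2, 0.1]),
    "forearm": np.array([0.3, 0.8, 0.2]),
    "finger":  np.array([0.1, 0.3, 0.9]),
}

# Illustrative bindings from body locations to device functions.
actions = {"palm": "play/pause", "forearm": "next track", "finger": "volume up"}

def classify_tap(features):
    """Return the body location whose reference vector is closest to this tap."""
    return min(centroids, key=lambda loc: np.linalg.norm(features - centroids[loc]))

tap = np.array([0.85, 0.25, 0.15])  # features extracted from a new tap
location = classify_tap(tap)
print(location, "->", actions[location])  # → palm -> play/pause
```

Nearest-centroid matching is just one plausible choice; any classifier trained per user would fit the same role.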
In the following example, a user is playing “Tetris” using his fingers as the control pad.
Another example of Skinput has the system render a series of buttons on the wearer’s arm. Users can simply tap the desired button with their fingers to perform a specific action.
The system is hierarchical, allowing the user to access a wide range of functionality. Finger inputs are segmented and classified in real time.
So the Skinput device works much like a current-day smartphone: you simply tap your fingers on your skin to control an MP3 player, call someone, or perhaps turn on your computer’s printer.
As it turns out, Skinput’s technology can also drive a scrolling interface: the user taps the top or bottom of the forearm to scroll through a list of items, taps the center of the forearm to activate the selected item, and performs a simple pinching gesture with the thumb and forefinger to go back up the interface hierarchy (see part 4 of the above image).
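The scrolling interface described above amounts to a small state machine that maps recognized gestures to menu actions. Here is a minimal sketch of that mapping, assuming hypothetical gesture labels ("tap_top", "tap_bottom", "tap_center", "pinch") coming out of the classifier; none of these names are from Skinput itself.

```python
class ScrollMenu:
    """Toy model of a gesture-driven scrolling menu."""

    def __init__(self, items):
        self.items = items
        self.index = 0  # currently highlighted item

    def handle(self, gesture):
        if gesture == "tap_top":        # tap near the top of the forearm: scroll up
            self.index = max(self.index - 1, 0)
        elif gesture == "tap_bottom":   # tap near the bottom: scroll down
            self.index = min(self.index + 1, len(self.items) - 1)
        elif gesture == "tap_center":   # tap the center: activate the selection
            return f"activated {self.items[self.index]}"
        elif gesture == "pinch":        # thumb-forefinger pinch: go back
            return "back"
        return self.items[self.index]

menu = ScrollMenu(["Music", "Calls", "Settings"])
print(menu.handle("tap_bottom"))  # → Calls
print(menu.handle("tap_center"))  # → activated Calls
print(menu.handle("pinch"))       # → back
```

Keeping the gesture recognizer and the menu logic separate like this is what lets the same four gestures drive any depth of menu hierarchy.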
Skinput vs DepthJS
Some days back we told you about DepthJS – a Chrome extension that allows users to navigate the web using hand gestures. The extension is in its early development stages, but the developers prove that you can build a system that understands human body signals and performs desirable actions, e.g. mouse clicks, opening windows, resizing, and so on. The following video shows how DepthJS works with a web browser:
There are quite a few similarities as well as differences between DepthJS and Skinput’s technology. The first similarity is that both of them use human body behavior as system input. While Skinput operates with basic electronic components, DepthJS is more geared towards computing activities.
Skinput provides a naturally portable finger-input system, while DepthJS is more of a remote-controlled mouse.
If you want to know more about Skinput’s tech and how the whole system works, be sure to check out this Microsoft Research page and read this documentation (PDF). Thanks, Paul.
Skinput was developed by Chris Harrison, Dan Morris, and Desney Tan at Microsoft’s research lab in Redmond, Washington.