Microsoft device uses the skin as a touchscreen

Microsoft and Carnegie Mellon University are working on a gadget that turns the user’s arm into a touchscreen display.

The technology, dubbed Skinput, consists of a projector that displays a keyboard on the wearer's arm or hand, and an armband containing sensors that pick up the vibrations when the user taps the 'keys'. Commands are then transmitted over a wireless link to the device being controlled.
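The overall control flow, as described, is simple: watch the armband's vibration stream, decide when a tap has happened, work out which projected 'key' was struck, and forward the corresponding command to the device. The sketch below illustrates that loop only in outline; every name in it (detect_tap, send_command, run_loop) and the bare energy threshold are hypothetical stand-ins, not part of the Skinput system itself.

import numpy as np

TAP_ENERGY_THRESHOLD = 0.5  # hypothetical trigger level for a skin tap

def detect_tap(window: np.ndarray) -> bool:
    """Treat any vibration window whose energy exceeds the threshold as a tap."""
    return float(np.sum(window ** 2)) > TAP_ENERGY_THRESHOLD

def send_command(key: str) -> None:
    """Stand-in for the wireless link to the controlled device."""
    print(f"sending command for key {key!r}")

def run_loop(sensor_windows, classify_key):
    """For each window of armband samples: detect a tap, identify which
    projected 'key' was struck, and forward the command to the device."""
    for window in sensor_windows:
        if detect_tap(window):
            send_command(classify_key(window))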

“This approach provides an always available, naturally portable, and on-body finger input system,” say the developers.

“Results from our experiments have shown that our system performs very well for a series of gestures, even when the body is in motion.”

There are two types of sensor: one that picks up a particular low-frequency acoustic band, corresponding to waves transmitted through the rippling of the skin, and another sensitive to higher frequencies, which captures signals transmitted through bone.
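One way to picture how those two bands could distinguish tap locations is to band-pass the raw vibration signal, measure the energy in each band, and compare the result against the pattern each location produced during training. The sketch below is only an illustration of that idea, assuming a 5 kHz sampling rate and made-up band edges; it is not the classifier the researchers actually used.

import numpy as np
from scipy.signal import butter, sosfilt

FS = 5000  # assumed sampling rate of the armband sensors, in Hz

def band_energy(signal, lo_hz, hi_hz):
    """Band-pass the raw vibration signal and return its energy in that band."""
    sos = butter(4, [lo_hz, hi_hz], btype="band", fs=FS, output="sos")
    return float(np.sum(sosfilt(sos, signal) ** 2))

def tap_features(signal):
    """Two features per tap: low band (skin surface waves), high band (bone conduction)."""
    return np.array([
        band_energy(signal, 20, 60),     # hypothetical low-frequency band
        band_energy(signal, 100, 1000),  # hypothetical higher-frequency band
    ])

def nearest_centroid(features, centroids):
    """Pick the tap location whose training centroid is closest to these features."""
    distances = {loc: np.linalg.norm(features - c) for loc, c in centroids.items()}
    return min(distances, key=distances.get)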

The researchers tried out various interface layouts, including a hierarchical menu, a scrolling menu and a numeric keypad.

With ten different 'keys' to tap, the researchers found that the system had an accuracy of over 80 percent, and it was considerably more accurate with fewer tap locations. It even worked well while the wearer was jogging, although accuracy dropped noticeably for overweight participants.