Google Glass app identifies people by their clothes

The first – very creepy – third-party app for Google Glass has been unveiled. InSight is designed to recognize people on the basis of their clothes.

Developed by researchers at Duke University, the app is partly funded by Google. The idea is to produce a ‘fashion fingerprint’ based on a user’s clothes, shoes, jewelry or glasses.

“For instance, Alice may look at people around her in a social gathering and see the names of each individual – like a virtual badge – suitably overlaid on her Google Glass display,” explain the developers.

“Where revealing names is undesirable, only a tweet message could be shared. People at the airport could tweet ‘looking to share a cab’, and Alice could view each individual’s tweet above their heads.”

First, the app acquires a series of photos of the user by ‘peeking’ at them while they use their phone. These photos are then distilled into a file called a ‘spatiogram’, and the person’s ‘fingerprint’ is broadcast to other smartphones nearby. When the person then enters the field of vision of a Google Glass wearer, an arrow hovers on-screen to identify them.
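A spatiogram is essentially a color histogram augmented with spatial information: for each color bin it also records where in the image those pixels sit. As a rough illustration of the idea (the bin count, color space and exact statistics here are assumptions for the sketch, not details taken from the InSight paper):

```python
import numpy as np

def spatiogram(image, bins=8):
    """Build a second-order spatiogram from an RGB image: for each
    occupied color bin, store the pixel count plus the mean and
    covariance of the (x, y) positions of that bin's pixels.
    Illustrative sketch only."""
    h, w, _ = image.shape
    # Quantize each RGB channel into `bins` levels, giving one bin index
    # per pixel in the range [0, bins**3).
    q = image.astype(np.int64) * bins // 256
    idx = (q[..., 0] * bins + q[..., 1]) * bins + q[..., 2]

    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    flat = idx.ravel()

    counts = np.bincount(flat, minlength=bins ** 3).astype(float)
    features = {}
    for b in np.nonzero(counts)[0]:
        pts = coords[flat == b]
        mean = pts.mean(axis=0)
        cov = np.cov(pts, rowvar=False) if len(pts) > 1 else np.zeros((2, 2))
        features[int(b)] = (counts[b], mean, cov)
    return features
```

Two shirts with identical overall colors but different patterns (say, stripes versus a solid block) would then produce different spatiograms, because the per-bin position statistics differ even when the plain histograms match.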

In tests using PivotHead cameras and Samsung Galaxy phones, say the developers, the system was able to identify 93 percent of people from the front. Even when people’s backs were turned, combining different views raised accuracy to 96 percent, though at the cost of slowing things down.

The researchers acknowledge that there could be certain privacy problems, particularly if users forget that they’re being photographed. This may be one reason why the app uses clothing rather than facial recognition; changing your hat could make you nicely anonymous again.