Researchers bring 'smart hands' closer to reality

A new system called SkinHaptics creates touch sensations by sending ultrasound through the hand, leaving the palm free to act as a display.

Update: 2016-04-12 12:10 GMT

London: You may soon be able to use your skin as a touchscreen, as researchers, including one of Indian origin, have successfully created tactile sensations on the palm using ultrasound sent through the hand.

The study by scientists from the University of Sussex in the UK is the first to find a way for users to feel what they are doing when interacting with displays projected onto their hand.

This solves one of the biggest challenges for technology companies that see the human body, particularly the hand, as the ideal display extension for the next generation of smartwatches and other smart devices, researchers said.

Current ideas rely on vibrations or pins, both of which need contact with the palm to work, interrupting the display. The innovation, called SkinHaptics, sends sensations to the palm from the other side of the hand, leaving the palm free to act as the display.

The device uses 'time-reversal' processing to send ultrasound waves through the hand. The technique works like ripples in water, but in reverse: the waves become more tightly focused as they travel through the hand, converging at a precise point on the palm, researchers said.
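To make the idea concrete, here is a minimal Python sketch of time-reversal focusing under heavily simplified assumptions: a homogeneous medium with a fixed speed of sound, spike-like impulse responses and a hypothetical eight-element transducer array. The geometry, sample rate and sound speed below are illustrative placeholders, not details of the SkinHaptics hardware.

```python
import numpy as np

C = 1500.0       # assumed speed of sound in soft tissue, m/s
FS = 2_000_000   # sample rate, Hz (illustrative)
N = 4096         # samples per signal

def impulse_response(distance):
    """Idealised medium: a unit spike after the travel delay."""
    h = np.zeros(N)
    h[int(round(distance / C * FS))] = 1.0
    return h

# Eight hypothetical transducers spread 4 cm across the back of the hand,
# with the focal point 3 cm away on the palm side.
xs = np.linspace(-0.02, 0.02, 8)
distances = np.sqrt(xs**2 + 0.03**2)

# Step 1: "record" a pulse emitted from the focal point at each transducer.
recordings = [impulse_response(d) for d in distances]

# Step 2: time-reverse each recording. Reversal converts each delay into
# an advance, so the waves that took the longest path are re-emitted first.
emissions = [r[::-1] for r in recordings]

# Step 3: propagate each emission back through the same medium (another
# convolution with the impulse response) and sum the field at the focus.
at_focus = sum(np.convolve(e, impulse_response(d))[:N]
               for e, d in zip(emissions, distances))

print("peak amplitude at focus:", at_focus.max())    # 8.0
print("arrival sample:", int(at_focus.argmax()))     # same sample for all paths
```

Running the sketch prints a peak amplitude of 8.0 at a single sample, showing that all eight time-reversed emissions arrive at the focal point simultaneously and add coherently, which is the essence of the technique. In a real hand the impulse responses would be measured rather than assumed, and the same reverse-and-replay step would automatically compensate for the tissue's irregularities.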

It draws on a rapidly growing field of technology called haptics, which is the science of applying touch sensation and control to interaction with computers and technology, they said.

According to Sriram Subramanian of the University of Sussex, who led the study, technologies will inevitably need to engage other senses, such as touch, as we enter what designers are calling an 'eye-free' age of technology.

"Wearables are already big business and will only get bigger. But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important," Subramanian said.

"If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user," he said.

"What we offer people is the ability to feel their actions when they are interacting with the hand," he added.
