Research brings 'smart hands' closer to reality

The SkinHaptics device sends ultrasound through the hand to precise points on the palm, paving the way for next-generation smart technology that uses your own skin as a touchscreen.

Using your skin as a touchscreen has been brought a step closer after UK scientists successfully created tactile sensations on the palm using ultrasound sent through the hand.

The University of Sussex-led study – funded by the Nokia Research Centre and the European Research Council – is the first to find a way for users to feel what they are doing when interacting with displays projected on their skin.

This solves one of the biggest challenges for designers who see the skin, particularly the hand, as the ideal display extension for the next generation of smartwatches and other smart devices.

Current approaches rely on vibrations or pins, both of which need contact with the palm to work, interrupting the display.

However, this new innovation, called SkinHaptics, sends sensations to the palm from the other side of the hand, leaving the palm free to display the screen.

The device uses 'time-reversal' processing to send ultrasound through the hand. This technique is effectively like ripples in water but in reverse – the waves become more targeted as they travel through the hand, ending at a precise point on the palm.
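To make the "ripples in reverse" idea concrete, the sketch below simulates time-reversal focusing with a small array of virtual emitters: each element records the signal that a source at the target point would produce, flips it in time and re-emits it, and the re-emitted waves add up constructively only at that point. This is an illustrative Python model under simple assumptions (a uniform speed of sound, an arbitrary 8-element array and focus position), not the SkinHaptics implementation.

```python
# Minimal sketch of time-reversal focusing (illustrative only; not the
# Sussex team's code). Geometry, speeds and pulse shape are assumptions.
import numpy as np

C = 1540.0          # assumed speed of sound in soft tissue (m/s)
FS = 2_000_000      # sample rate (Hz)

# Emitter array on the back of the hand (x positions, metres) and a focus
# point on the palm side, both chosen arbitrarily for the demo.
emitters = np.linspace(-0.02, 0.02, 8)          # 8 elements over 4 cm
focus = np.array([0.005, 0.03])                  # 5 mm across, 3 cm deep

def delay_samples(x_emitter, point):
    """Propagation delay from an emitter at (x_emitter, 0) to a field point."""
    dist = np.hypot(point[0] - x_emitter, point[1])
    return int(round(dist / C * FS))

# Short pulse that each element would record from a virtual source at the focus.
t = np.arange(0, 200) / FS
pulse = np.sin(2 * np.pi * 200_000 * t) * np.hanning(t.size)

n = 4000
recordings = np.zeros((emitters.size, n))
for i, xe in enumerate(emitters):
    d = delay_samples(xe, focus)
    recordings[i, d:d + pulse.size] = pulse      # what element i "hears"

# Time-reversal: flip each recording and re-emit it. The field at any point is
# the sum of the re-emitted signals, each delayed by its own travel time.
emissions = recordings[:, ::-1]

def field_energy(point):
    total = np.zeros(2 * n)
    for i, xe in enumerate(emitters):
        d = delay_samples(xe, point)
        total[d:d + n] += emissions[i]
    return float(np.sum(total ** 2))

at_focus = field_energy(focus)
off_focus = field_energy(focus + np.array([0.01, 0.0]))  # 1 cm away
print(f"energy at focus:  {at_focus:.3f}")
print(f"energy 1 cm away: {off_focus:.3f}")     # noticeably lower
```

Running the sketch shows the summed field energy peaking at the chosen focus point and falling off a centimetre away, which is the behaviour the time-reversal technique relies on to target a precise point on the palm.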

It draws on a rapidly growing field of technology called haptics, which is the science of applying touch sensation and control to interaction with computers and technology.

Professor Sriram Subramanian, who leads the research team at the University of Sussex, says that technologies will inevitably need to engage other senses, such as touch, as we enter what designers are calling an 'eye-free' age of technology.

He says: "Wearables are already big business and will only get bigger. But as we wear technology more, it gets smaller and we look at it less, and therefore multisensory capabilities become much more important.

"If you imagine you are on your bike and want to change the volume control on your smartwatch, the interaction space on the watch is very small. So companies are looking at how to extend this space to the hand of the user.


"What we offer people is the ability to feel their actions when they are interacting with the hand."

The findings were presented at the IEEE Haptics Symposium 2016 in Philadelphia, USA, by the study's co-author Dr Daniel Spelmezan, a research assistant in the Interact Lab. The symposium concludes today (Monday 11 April 2016).

