Remote Haptics and Mid-air Gestures
Interaction with mobile devices, including phones and wearables, is often limited by their small screens. Little space is available for displaying content and giving users feedback about their interactions, which makes introducing new interaction modalities difficult. Novel interactions, such as speech and in-air gestures, require good feedback, but small or screen-less mobile devices struggle to provide it visually. We can overcome this problem by using alternative types of feedback instead: for example, tactile feedback and visual feedback in the periphery of the device can keep the limited screen space free for showing interactive content.
In-air gesture interaction is a novel way of interacting with mobile phones, and recent products have been released with these capabilities. However, these interactions give limited feedback, which makes them difficult to use. We investigated novel types of feedback for in-air gestures, so that feedback can be given without affecting on-screen content. This project was a three-year PhD studentship partly funded by Nokia Research Centre in Finland.
Tactile Feedback for In-Air Gestures
We investigated tactile feedback for in-air gesture interfaces. Giving such feedback is challenging because users do not physically contact in-air gesture systems, so novel solutions are required to present tactile sensations. In this project we compared ultrasound haptics (see UltraTouch) to tactile feedback from wearable devices, like smartwatches and activity trackers. We also investigated feedback design, to see how the complexity of feedback affected interaction. For more information, see here or our ICMI 2014 paper on tactile feedback for above-device interfaces.
Addressing In-Air Gesture Interfaces
To address an in-air gesture system, users need to know where to perform gestures and how to direct their input towards the system. We investigated novel interaction techniques to help users solve these problems. We evaluated the techniques individually and found them successful, then combined them into a single technique, called Do That, There, which shows users how to direct their input while also helping them find where to gesture. These interactions used wearable tactile, auditory and peripheral visual feedback to enable gesture interaction with screen-less, or small-screen, devices.
- Do That, There: An Interaction Technique for Addressing In-Air Gesture Systems. In Proceedings of the 34th Annual ACM Conference on Human Factors in Computing Systems – CHI ’16, ACM Press, pp. 2319-2331, 2016. Link.
- Towards Mid-Air Haptic Widgets. In CHI 2016 Workshop on Mid-Air Haptics and Displays: Systems for Un-instrumented Mid-Air Interactions, 2016. Link.
- Towards In-Air Gesture Control of Household Appliances with Limited Displays. In Proceedings of Interact ’15 Posters, Springer, 2015. Link.
- Interactive Light Feedback: Illuminating Above-Device Gesture Interfaces. In Proceedings of Interact ’15 Demos, Springer, 2015. Link.
- Illuminating Gesture Interfaces with Interactive Light Feedback. In Proceedings of NordiCHI ’14 Beyond the Switch Workshop, 2014. Link.
- Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to Touchless Interactions. In Proceedings of International Conference on Multimodal Interaction – ICMI ’14, ACM Press, pp. 419-426, 2014. Link.
- Towards Usable and Acceptable Above-Device Interactions. In Proceedings of Mobile HCI ’14 Posters, ACM Press, pp. 459-464, 2014. Link.