The Levitate project is investigating a radical new way of interacting with data by using levitating particles to create a mid-air display. Users will be able to see, hear and feel objects levitating in front of them, without having to wear or hold any device.
The following image shows an example of acoustic levitation using one of our prototype devices: eight polystyrene beads at the corners of a cube, levitated between two arrays of ultrasound transducers. By manipulating the acoustic field between the arrays, we can move each object independently in three dimensions.
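To give a flavour of how a phased array steers the acoustic field, here is a minimal, hypothetical sketch (not the Levitate project's actual code) of the basic operation: computing per-transducer phase delays so that 40 kHz ultrasound emissions arrive in phase at a chosen focal point. The grid layout, pitch, and frequency below are illustrative assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air at ~20 °C
FREQUENCY = 40_000.0     # Hz, typical for levitation transducers
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focus_phases(transducer_positions, focal_point):
    """Phase delay (radians) for each transducer so that all
    emissions arrive in phase at focal_point."""
    positions = np.asarray(transducer_positions, dtype=float)
    distances = np.linalg.norm(positions - np.asarray(focal_point), axis=1)
    wavenumber = 2 * np.pi / WAVELENGTH
    # Negative sign: transducers farther from the focus fire with a
    # phase advance so their wavefronts arrive together.
    return (-wavenumber * distances) % (2 * np.pi)

# Example: a 4x4 grid of transducers at 1 cm pitch in the z=0 plane,
# focused 10 cm above the centre of the array.
grid = np.array([[x * 0.01, y * 0.01, 0.0]
                 for x in range(4) for y in range(4)])
centre = grid.mean(axis=0)
phases = focus_phases(grid, centre + [0.0, 0.0, 0.10])
```

Moving a levitated particle then amounts to recomputing these phases as the focal (trap) point is swept along a path, fast enough that the particle follows the moving acoustic trap.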
Our role in the Levitate project is to investigate new ways of interacting with levitating objects. We have been exploring how users can give input to systems of levitating objects, and how levitating objects can present feedback and content to users. As part of this work, we are also investigating mid-air ultrasound haptic feedback.
- Point-and-Shake: Selecting Levitating Objects (CHI 2018)
- Levitating Particle Displays (ACM Pervasive Displays 2018)
- Enhancing Physical Objects with Actuated Levitating Particles (ACM Pervasive Displays 2019)
- HaptiGlow: Positioning Hands for Optimal Mid-Air Interaction (IEEE World Haptics 2019)
- Apr 2019: Full paper accepted for IEEE World Haptics ’19.
- Mar 2019: Full paper accepted for ACM Pervasive Displays ’19.
- Feb 2019: Two demos accepted for CHI ’19.
- Nov 2018: Demo at SICSA DemoFest in Edinburgh, Scotland.
- Jun 2018: Presenting paper at Pervasive Displays ’18 in Munich, Germany.
- Apr 2018: Presenting paper at CHI ’18 in Montreal, Canada.
- Mar 2018: ACM Pervasive Displays ’18 paper accepted.
- Dec 2017: ACM CHI ’18 paper accepted.
- Oct 2017: Demo at ACM ISS ’17 in Brighton, England.
- Oct 2017: Demo at SICSA DemoFest in Edinburgh, Scotland.
- Sep 2017: ACM ICMI ’17 demo accepted.
- Aug 2017: ACM ISS ’17 demo accepted.
- Jun 2017: Demo at Pervasive Displays ’17 in Switzerland.
- Jan 2017: Project started.
- Enhancing Physical Objects with Actuated Levitating Particles.
- Euan Freeman, Asier Marzo, Praxitelis B. Kourtelos, Julie R. Williamson, and Stephen Brewster.
- Proceedings of the 8th ACM International Symposium on Pervasive Displays (PerDis ’19).
- Levitating Object Displays with Interactive Voxels.
- Euan Freeman, Julie Williamson, Praxitelis Kourtelos, and Stephen Brewster.
- Proceedings of the 7th ACM International Symposium on Pervasive Displays (PerDis ’18).
- Point-and-Shake: Selecting from Levitating Object Displays.
- Euan Freeman, Julie Williamson, Sriram Subramanian, and Stephen Brewster.
- Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems (CHI ’18).
- Textured Surfaces for Ultrasound Haptic Displays.
- Euan Freeman, Ross Anderson, Julie Williamson, Graham Wilson, and Stephen Brewster.
- Demo at the ACM International Conference on Multimodal Interaction (ICMI ’17).
- Floating Widgets: Interaction with Acoustically-Levitated Widgets.
- Euan Freeman, Ross Anderson, Carl Andersson, Julie Williamson, and Stephen Brewster.
- Demo at the ACM International Conference on Interactive Surfaces and Spaces (ISS ’17).
- Levitate: Interaction with Floating Particle Displays.
- Julie Williamson, Euan Freeman, and Stephen Brewster.
- Demo at the ACM International Symposium on Pervasive Displays (PerDis ’17).
We’re working with some awesome people on this project: Chalmers University of Technology in Gothenburg, Sweden; the University of Bayreuth in Germany; the University of Sussex in Brighton, England; and Ultrahaptics in Bristol, England.
At the University of Glasgow, Stephen Brewster, Julie Williamson, Euan Freeman and Praxitelis Kourtelos are working on the project.
This research has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement #737087.