Levitate project logo

Levitate

The Levitate project is investigating a radical new way of interacting with data by using levitating particles to create a mid-air display. Users will be able to see, hear, and feel objects that levitate in front of them, without having to wear any device.

The following image shows an example of acoustic levitation using one of our prototypes: two small polystyrene beads levitated between two arrays of ultrasound transducers. By manipulating the acoustic field between the transducers, we can reposition the beads in three dimensions.

An acoustic levitation device with two small beads inside.
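The principle behind this repositioning is phased-array focusing: each transducer's signal is delayed so that all waves arrive in phase at a chosen focal point, and moving that point moves the trapped bead. The sketch below illustrates the idea; the array geometry, 40 kHz frequency, and function names are illustrative assumptions, not the actual parameters or code of the Levitate prototypes.

```python
# Minimal sketch of phased-array focusing for acoustic levitation.
# Assumption: a simple line array of point-source transducers; the real
# Levitate hardware and its control software are not reproduced here.
import math

SPEED_OF_SOUND = 343.0               # m/s in air
FREQUENCY = 40_000.0                 # 40 kHz, typical for airborne ultrasound
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY
WAVE_NUMBER = 2 * math.pi / WAVELENGTH

def focus_phases(transducers, focal_point):
    """Return a phase offset (radians, in [0, 2*pi)) per transducer so
    that all emitted waves arrive in phase at focal_point."""
    fx, fy, fz = focal_point
    phases = []
    for tx, ty, tz in transducers:
        distance = math.sqrt((fx - tx) ** 2 + (fy - ty) ** 2 + (fz - tz) ** 2)
        # Retard each signal by its propagation phase so arrivals align.
        phases.append((-WAVE_NUMBER * distance) % (2 * math.pi))
    return phases

# Example: a 4-element line array (1 cm pitch) focusing 5 cm above its centre.
array = [(x * 0.01, 0.0, 0.0) for x in (-1.5, -0.5, 0.5, 1.5)]
phases = focus_phases(array, (0.0, 0.0, 0.05))
```

Sweeping the focal point along a path and recomputing the phases each frame is, in essence, how an object is repositioned; trapping a bead stably additionally requires shaping the field into a trap (e.g. a twin trap or vortex), which is beyond this sketch.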

Our role on the Levitate project is to investigate new types of interaction with levitating objects: providing input to systems of levitating objects, using those objects to present feedback and content to users, and giving mid-air ultrasound haptic feedback.

Research Highlights

Quick Updates

  • Nov ’18: Demo at SICSA DemoFest in Edinburgh, Scotland
  • Jun ’18: Presenting paper at Pervasive Displays ’18 in Munich, Germany
  • Apr ’18: Presenting paper at CHI ’18 in Montreal, Canada
  • Mar ’18: ACM Pervasive Displays ’18 paper accepted
  • Dec ’17: ACM CHI ’18 paper accepted
  • Oct ’17: Demo at ACM ISS ’17 in Brighton, England
  • Oct ’17: Demo at SICSA DemoFest in Edinburgh, Scotland
  • Sep ’17: ACM ICMI ’17 demo accepted
  • Aug ’17: ACM ISS ’17 demo accepted
  • Jun ’17: Demo at Pervasive Displays conference in Switzerland
  • Jan ’17: Project started

Publications

  • Levitating Object Displays with Interactive Voxels.
    • Euan Freeman, Julie Williamson, Praxitelis Kourtelos, and Stephen Brewster.
    • Proceedings of the 7th ACM International Symposium on Pervasive Displays (PerDis ’18).
    • [link]
  • Point-and-Shake: Selecting from Levitating Object Displays.
    • Euan Freeman, Julie Williamson, Sriram Subramanian, and Stephen Brewster.
    • Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems (CHI ’18).
    • [link]
  • Textured Surfaces for Ultrasound Haptic Displays.
    • Euan Freeman, Ross Anderson, Julie Williamson, Graham Wilson, and Stephen Brewster.
    • Demo at the ACM International Conference on Multimodal Interaction (ICMI ’17).
  • Floating Widgets: Interaction with Acoustically-Levitated Widgets.
    • Euan Freeman, Ross Anderson, Carl Andersson, Julie Williamson, and Stephen Brewster.
    • Demo at ACM Interactive Surfaces and Spaces 2017 (ISS ’17).
    • [link]
  • Levitate: Interaction with Floating Particle Displays.
    • Julie Williamson, Euan Freeman, and Stephen Brewster.
    • Demo at Pervasive Displays 2017 (PerDis ’17).
    • [link]

Project Team

We’re working with some awesome people on this project: Chalmers University of Technology in Gothenburg, Sweden; Bayreuth University in Bayreuth, Germany; University of Sussex in Brighton, England; and Ultrahaptics in Bristol, England.

At the University of Glasgow, Stephen Brewster, Julie Williamson, Euan Freeman, and Praxitelis Kourtelos are working on the project.

University of Glasgow, Chalmers, Sussex, and Ultrahaptics logos.

Acknowledgements

This research has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement #737087.