16/4/18 CHI 2018

We’re super excited about CHI 2018, which starts next week! We have three full papers at the conference and are also presenting two workshop papers.

Full Papers

  • Point-and-Shake: Selecting from Levitating Object Displays.
    • Euan Freeman, Julie Williamson, Sriram Subramanian, and Stephen Brewster.
    • From the Levitate project.
  • Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain.
    • Graham Wilson, Mark McGill, Matthew Jamieson, Julie Williamson, and Stephen Brewster.
  • Investigating Perceptual Congruence Between Data and Display Dimensions in Sonification.
    • Jamie Ferguson and Stephen Brewster.

Workshop Papers

  • Haptic Feedback for Mid-Air Gestures: What, When, How?
    • Euan Freeman, Praxitelis Kourtelos, Julie Williamson, and Stephen Brewster.
    • CHI 2018 Workshop on Mid-Air Haptics for Control Interfaces.
    • From the Levitate project.
  • On-Wheel vs Mid-Air Haptics: Where is Best for In-Car Interaction?
    • Graham Wilson and Stephen Brewster.
    • CHI 2018 Workshop on Mid-Air Haptics for Control Interfaces.
    • From the Prestige project.

28/9/17 SICSA DemoFest

We are going to be at SICSA DemoFest ’17 in Edinburgh on the 3rd of October. SICSA DemoFest is an annual event that showcases Scottish computing science research, aimed mainly at industry.

We’ll have demonstrations of acoustic levitation and mid-air haptics (from Levitate), future in-car interfaces (from our In-Car Interaction research), and new types of audio feedback.

15/5/17 Book Chapter on Multimodal Feedback

We contributed a book chapter to the recently published Handbook of Multimodal-Multisensor Interfaces. Our chapter (link) presents an overview of haptic and non-speech audio feedback, discusses examples and benefits of multimodal feedback, and looks at application areas where multimodal feedback can improve usability.

3/2/17 ABBI Project Completed

This week saw the conclusion of our EU-funded ABBI project, a three-year collaboration with the Istituto Italiano di Tecnologia, Lund University, University of Hamburg, and the Chiossone Institute. You can read more about the ABBI project here.

The primary aim of the project was to develop and evaluate an audio bracelet for the rehabilitation of young visually impaired children, helping them develop spatial understanding and movement skills. Our recent research on the project investigated other ways the capabilities of the ABBI audio bracelet could be used to assist visually impaired children. This culminated in our CHI 2017 paper about audible beacons, which explored how sound from such devices could be used to support independent play and movement. The following video gives a summary of this work.

10/1/17 CHI 2017 Papers

We are pleased to announce that we have had five full papers accepted to CHI 2017:

  • Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently.
    • Euan Freeman, Graham Wilson, Stephen Brewster, Gabriel Baud-Bovy, Charlotte Magnusson, and Hector Caltenco
    • From the ABBI project
  • Multi-moji: Combining Thermal, Vibrotactile & Visual Stimuli to Expand the Affective Range of Feedback.
    • Graham Wilson and Stephen Brewster
  • An Evaluation of Input Controls for In-Car Interactions.
    • Alexander Ng, Stephen Brewster, Frank Beruscha, and Wolfgang Krautter
    • From the HAPPINESS project
  • I Am The Passenger: How Visual Motion Cues Can Influence Sickness For In-Car VR.
    • Mark McGill, Alexander Ng, and Stephen Brewster
  • ForgetMeNot: Active Reminder Entry Support for Adults with Acquired Brain Injury.
    • Matthew Jamieson, Brian O’Neill, Breda Cullen, Marilyn McGee-Lennon, Stephen Brewster, and Jonathan Evans

Mark McGill will also be presenting an ACM TOCHI article at CHI. More details about this will be added soon.


9/1/17 Introducing Levitate

Our new year is off to an exciting start – this month we are kicking off Levitate, a four-year EU project. We are going to be working with the University of Sussex, Chalmers University of Technology, Aarhus University and UltraHaptics, to develop a radical new display based on levitating particles.

Levitate envisions a new form of human-computer interaction where users can reach out and interact with physical objects in mid-air. Our technology aims to allow users to see, feel and hear the effects of their interactions with the levitating objects. Be sure to check back shortly for more information about this exciting new project as it develops.

6/1/17 Two new PhD students

Two new PhD students joined us in October 2016 – welcome Paddy and Jamie! Paddy’s research, funded by a studentship from Jaguar Land Rover, will focus on in-car haptic interactions. Jamie’s research is investigating the sonification of astronomical and astrophysical data.

4/10/16 ICMI and NordiCHI

We’re going to be at ICMI in Tokyo and NordiCHI in Gothenburg later this year, presenting research from the ABBI project. Look out for our posters and demos and come and say hello!

At NordiCHI we’ll be presenting a poster:

  • Automatically Adapting Home Lighting to Assist Visually Impaired Children
    • Euan Freeman, Graham Wilson, and Stephen Brewster

And at ICMI we are presenting two demos:

  • Towards a Multimodal Adaptive Lighting System for Visually Impaired Children
    • Euan Freeman, Graham Wilson, and Stephen Brewster
  • Multimodal Affective Feedback: Combining Thermal, Vibrotactile, Audio and Visual Signals
    • Graham Wilson, Euan Freeman, and Stephen Brewster