PRESTIGE: Printed Interactive Materials for End-User Products

The PRESTIGE project is an EU H2020 collaboration between 16 partners developing novel printed sensing, actuation and construction materials (electroactive fluorinated polymers, photoactive materials, electroactive organic moieties, fluorinated relaxor terpolymers, tailor-made polymers for overmoulding and organo-mineral coatings) and integrating them into TRL7 demonstrators: haptic feedback on steering wheels, energy-harvesting wearables and e-plastic labelling with oleophobic coating for sustainable product packaging.

During the project, Glasgow is prototyping haptic feedback for in-car interfaces and designing user interactions for novel smart-sensing product packaging.

TEAM ITN

TEAM (Technology Enabled Mental Health for Young People) is a four-year Innovative Training Network (ITN) focusing on the design, development and evaluation of new technology-enabled mental health services for young people. The overall objective is to train a new generation of researchers who can deliver more effective, affordable and accessible mental health services for young people.

Research Objectives

TEAM’s research programme is built around four research objectives, each addressing a key challenge in improving mental health services:

  1. Assessment
  2. Prevention
  3. Treatment
  4. Policy

TEAM Consortium

TEAM consists of 15 Early Stage Researchers and involves 9 beneficiaries:

  • Four universities (Technical University of Denmark, Technical University Vienna, University of Glasgow and University College Dublin);
  • Two university hospitals (Medical University Vienna, Psychiatric Centre Copenhagen (Region Hovedstaden));
  • Two not-for-profit organisations (The Anna Freud National Centre for Children and Families, ReachOut Ireland Ltd); and
  • One industry research laboratory (Telefonica Innovacion Alpha).

Acknowledgements

TEAM is an Innovative Training Network funded by the European Union’s Horizon 2020 research and innovation programme under Marie Skłodowska-Curie grant agreement No. 722561.

Thermal Feedback for In-Car Applications

This project, which was funded by Jaguar Land Rover and the EPSRC, investigated thermal feedback in cars.

In modern cars, most information is presented to the driver visually, frequently on the centre console, but taking visual attention off the road contributes to crashes and near-crash incidents. Auditory feedback is often described as disruptive during conversations. Haptic feedback, on the other hand, can be presented unobtrusively and can improve reaction times, especially when combined with other feedback modalities. In this project, we explored a new form of haptic interaction in the car: thermal feedback. Vibration can be hard to use because of the car’s natural vibration during driving and because the source of a vibration can be hard to pinpoint on a steering wheel. Thermal interaction does not share these disadvantages.

Thermal feedback was explored both for directional cues and for notifications.
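
Thermal displays of this kind are commonly built around Peltier (thermoelectric) elements that can warm or cool the skin. As a minimal, hypothetical sketch only (the set_offset() driver interface and the stimulus parameters are assumptions, not the project’s hardware API), a directional cue might briefly warm the cued side of the steering wheel:

    import time

    def thermal_cue(peltier_left, peltier_right, direction, delta_c=3.0, hold_s=2.0):
        # Warm the cued side of the wheel relative to a skin-neutral baseline,
        # hold the stimulus long enough to be perceived, then return to neutral.
        target = peltier_left if direction == "left" else peltier_right
        target.set_offset(delta_c)
        time.sleep(hold_s)
        target.set_offset(0.0)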

A press release by Jaguar Land Rover illustrates the use of thermal feedback for navigation.

Publications

  • Purring Wheel: Thermal and Vibrotactile Notifications on the Steering Wheel. Patrizia Di Campli San Vito, Frank Pollick, Simon Thompson, Lee Skrypchuk, Alexandros Mouzakitis and Stephen Brewster. In Proceedings of the 22nd ACM International Conference on Multimodal Interaction (ICMI ’20). https://doi.org/10.1145/3382507.3418825
  • Haptic Feedback for the Transfer of Control in Autonomous Vehicles. Patrizia Di Campli San Vito, Edward Brown, Frank Pollick, Simon Thompson, Lee Skrypchuk, Alexandros Mouzakitis and Stephen Brewster. In Adjunct Proceedings of the 12th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutoUI ’20). https://doi.org/10.1145/3409251.3411717
  • Thermal Feedback for Simulated Lane Change Scenarios. Patrizia Di Campli San Vito, Frank Pollick, Stuart White, Lee Skrypchuk, Alexandros Mouzakitis and Stephen Brewster. International Journal of Mobile Human Computer Interaction (IJMHCI), Special Issue on Recent Advances in Automotive User Interfaces and Interactive Vehicular Applications Research (2019). https://doi.org/10.4018/IJMHCI.2019040103
  • Haptic Navigation Cues on the Steering Wheel. Patrizia Di Campli San Vito, Gözel Shakeri, Frank Pollick, Edward Brown, Lee Skrypchuk, Alexandros Mouzakitis and Stephen Brewster. In Proceedings of the 37th Annual ACM Conference on Human Factors in Computing Systems (CHI ’19). https://doi.org/10.1145/3290605.3300440
  • Investigation of Thermal Stimuli for Lane Changes. Patrizia Di Campli San Vito, Frank Pollick, Stuart White, Lee Skrypchuk, Alexandros Mouzakitis and Stephen Brewster. In Proceedings of the 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications (AutoUI ’18). https://doi.org/10.1145/3239060.3239062
  • Thermal In-Car Interaction for Navigation. Patrizia Di Campli San Vito, Frank Pollick, Stuart White and Stephen Brewster. Demonstration at the 19th ACM International Conference on Multimodal Interaction (ICMI ’17). https://doi.org/10.1145/3136755.3143029

Synchronous At-A-Distance TV and VR

Introduction

Often, the focus of shared experiences is on those that occur in the same place and at the same time (collocated). This work instead examined synchronous at-a-distance media consumption, where users may be anywhere from different rooms to different continents, from two perspectives: how it can be facilitated using existing consumer displays (TVs combined with smartphones), and using imminently available consumer displays (VR HMDs combined with RGBD sensing).
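
A core technical requirement for synchronous at-a-distance viewing is keeping the remote players in step. Purely as an illustration (this is not CastAway’s actual protocol), playback positions can be exchanged periodically and a player re-seeked when the drift grows too large:

    MAX_DRIFT_S = 0.5  # tolerated playback drift in seconds (assumed value)

    def correct_drift(local_position_s, remote_position_s, seek):
        # Compare playback positions exchanged over the network and re-seek
        # this player when the two streams have drifted too far apart.
        drift = remote_position_s - local_position_s
        if abs(drift) > MAX_DRIFT_S:
            seek(remote_position_s)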

In a TOCHI journal paper published on this project, we discuss results from an initial evaluation of a synchronous shared at-a-distance smart TV system, CastAway. Through week-long in-home deployments with five couples, we gained formative insights into the adoption and usage of at-a-distance media consumption and into how couples communicated during it.

We then examined, in a laboratory study of 12 pairs, how the imminent availability and potential adoption of consumer VR HMDs could affect preferences for how synchronous at-a-distance media consumption is conducted, by enhancing media immersion and supporting embodied telepresence for communication.

Combined, these studies begin to explore a design space covering the varying ways in which at-a-distance media consumption can be supported and experienced (through music, TV content, existing TV content augmented for immersion, and immersive VR content), the factors that might influence usage and adoption, and the implications for supporting communication and telepresence during media consumption.

Publications

  • McGill, M., Williamson, J. and Brewster, S.: Examining The Role Of Smart TVs And VR HMDs In Synchronous At-A-Distance Media Consumption. ACM Transactions on Computer-Human Interaction, ACM, 2016.

In-Car Use of VR HMDs by Passengers

Introduction

Immersive HMDs are becoming everyday consumer items and, as they offer new possibilities for entertainment and productivity, people will want to use them during travel in, for example, autonomous cars. However, their use is confounded by motion sickness, caused in part by the restricted visual perception of motion conflicting with physically perceived vehicle motion (accelerations and rotations detected by the vestibular system). Whilst VR HMDs restrict visual perception of motion, they could also render it virtually, potentially alleviating this sensory conflict.
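
As a rough sketch of what rendering motion virtually could look like, the following Python fragment integrates vehicle acceleration into a velocity estimate and translates the virtual camera rig accordingly, so that visually perceived motion tracks what the vestibular system feels. It is illustrative only: the study compared several different visual presentations, and a real system would fuse IMU data with GPS or wheel-speed signals to limit integration drift.

    import numpy as np

    class VirtualMotionRenderer:
        # Dead-reckons vehicle motion from accelerometer samples and returns
        # the per-frame translation to apply to the VR camera rig.

        def __init__(self):
            self.velocity = np.zeros(3)  # m/s, in the vehicle frame

        def update(self, accel, dt):
            # accel: vehicle-frame acceleration in m/s^2, gravity removed.
            self.velocity += accel * dt
            return self.velocity * dt  # translation for this frame

    renderer = VirtualMotionRenderer()
    offset = renderer.update(np.array([0.5, 0.0, 0.0]), dt=1 / 90)  # one 90 Hz frame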

Accordingly, we conducted the first on-road, in-motion study to systematically investigate the effects of various visual presentations of a car’s real-world motion on the sickness and immersion of VR-HMD-wearing passengers. We established new baselines for in-car VR motion sickness and found that no single presentation best balances sickness and immersion. Instead, user preferences suggest that differently susceptible users require different solutions to make in-car VR usable. This work was published at CHI 2017 (see reference below), receiving an Honorable Mention award, and provided formative insights for VR designers as well as an entry point for further research into enabling the use of VR HMDs, and the rich experiences they offer, when travelling.

Publications

  • McGill, M., Ng, A., and Brewster, S.: I Am The Passenger: How Visual Motion Cues Can Influence Sickness For In-Car VR. In Proceedings of the 35th Conference on Human Factors in Computing Systems – CHI ’17, ACM Press, 2017.
  • How Visual Motion Cues Can Influence Sickness For In-Car VR. Mark McGill, Alexander Ng and Stephen Brewster. Video showcase at CHI 2017.
  • I Am The Passenger: Challenges in Supporting AR/VR HMDs In-Motion. Mark McGill and Stephen Brewster. Video showcase at AutoUI 2017.
  • Challenges in Supporting AR/VR HMDs In-Motion. Mark McGill and Stephen Brewster. Workshop on Augmented Reality for Intelligent Vehicles (ARIV) at AutoUI 2017.

3/2/17 ABBI Project Completed

This week saw the conclusion of our EU-funded ABBI project, a three-year collaboration with the Istituto Italiano di Tecnologia, Lund University, University of Hamburg, and the Chiossone Institute. You can read more about the ABBI project below.

The primary aim of the project was to develop and evaluate an audio bracelet to assist in the rehabilitation of young visually impaired children, helping them develop spatial understanding and movement skills. Our recent research on the project investigated other ways the capabilities of the ABBI audio bracelet could be used to assist visually impaired children. This culminated in our CHI 2017 paper on audible beacons and how sound from such devices could support independent play and movement. The following video gives a summary of this work.

Levitate

The Levitate project is investigating a radical new way of interacting with data by using levitating particles to create a mid-air display. Users will be able to see, hear and feel objects that levitate in front of them, without having to wear any additional device.

The following image shows an example of acoustic levitation, using one of our prototype devices. It shows eight polystyrene beads at the corners of a cube, which are being levitated between two arrays of ultrasound transducers. By manipulating the acoustic field between the transducers, we are able to move the objects independently in three dimensions.
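
To give a flavour of how the acoustic field is steered, the sketch below computes the emission phase for each transducer so that all waves arrive in phase at a chosen focal point (simple time-of-flight focusing). The 40 kHz frequency, array geometry and focal position are assumptions for illustration; real levitation traps also superimpose a trap signature (such as a twin-trap pattern) on these focusing phases.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
    FREQUENCY = 40_000.0    # Hz; 40 kHz transducers are typical (assumption)
    WAVENUMBER = 2 * np.pi * FREQUENCY / SPEED_OF_SOUND

    def focus_phases(transducer_positions, focal_point):
        # Phase each transducer so its wave arrives in phase at the focus.
        distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
        return (-WAVENUMBER * distances) % (2 * np.pi)

    # Example: an 8x8 array with 1 cm pitch, focused 10 cm above its centre.
    xs, ys = np.meshgrid(np.arange(8) * 0.01, np.arange(8) * 0.01)
    positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(64)])
    phases = focus_phases(positions, np.array([0.035, 0.035, 0.10]))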

Our role on the Levitate project is to investigate new types of interaction with levitating objects. We have been exploring ways of providing input to systems consisting of levitating objects. We’ve also been thinking about how we can use levitating objects to present feedback and content to users. As part of this project we are also investigating mid-air ultrasound haptic feedback.

Research Highlights

  • Apr 2020: ACM CHI 2020 Late Breaking Works paper accepted, Honolulu, Hawaii
  • Apr 2019: Full paper accepted for IEEE World Haptics ’19
  • Mar 2019: Full paper accepted for ACM Pervasive Displays ’19
  • Feb 2019: Two demos accepted for CHI ’19
  • Nov 2018: Demo at SICSA DemoFest in Edinburgh, Scotland
  • Jun 2018: Presented paper at Pervasive Displays ’18 in Munich, Germany
  • Apr 2018: Presented paper at CHI ’18 in Montreal, Canada
  • Mar 2018: ACM Pervasive Displays ’18 paper accepted
  • Dec 2017: ACM CHI ’18 paper accepted
  • Oct 2017: Demo at ACM ISS ’17 in Brighton, England
  • Oct 2017: Demo at SICSA DemoFest in Edinburgh, Scotland
  • Sep 2017: ACM ICMI ’17 demo accepted
  • Aug 2017: ACM ISS ’17 demo accepted
  • Jun 2017: Demo at Pervasive Displays ’17 in Switzerland
  • Jan 2017: Project started

Publications

  • Enhancing Physical Objects with Actuated Levitating Particles.
    • Euan Freeman, Asier Marzo, Praxitelis B. Kourtelos, Julie R. Williamson, and Stephen Brewster.
    • Proceedings of the 8th ACM International Symposium on Pervasive Displays (PerDis ’19).
  • Levitating Object Displays with Interactive Voxels.
    • Euan Freeman, Julie Williamson, Praxitelis Kourtelos, and Stephen Brewster.
    • Proceedings of the 7th ACM International Symposium on Pervasive Displays (PerDis ’18).
  • Point-and-Shake: Selecting from Levitating Object Displays.
    • Euan Freeman, Julie Williamson, Sriram Subramanian, and Stephen Brewster.
    • Proceedings of the 36th Annual ACM Conference on Human Factors in Computing Systems (CHI ’18).
  • Textured Surfaces for Ultrasound Haptic Displays.
    • Euan Freeman, Ross Anderson, Julie Williamson, Graham Wilson, and Stephen Brewster.
    • Demo at the ACM International Conference on Multimodal Interaction (ICMI ’17).
  • Floating Widgets: Interaction with Acoustically-Levitated Widgets.
    • Euan Freeman, Ross Anderson, Carl Andersson, Julie Williamson, and Stephen Brewster.
    • Demo at ACM Interactive Surfaces and Spaces 2017 (ISS ’17).
  • Levitate: Interaction with Floating Particle Displays.
    • Julie Williamson, Euan Freeman, and Stephen Brewster.
    • Demo at Pervasive Displays 2017 (PerDis ’17).

Project Team

We’re working with some awesome people on this project: Chalmers University of Technology in Gothenburg, Sweden; Bayreuth University in Bayreuth, Germany; University of Sussex in Brighton, England; and Ultrahaptics in Bristol, England.

At the University of Glasgow, we have Stephen Brewster, Julie Williamson, Euan Freeman and Praxitelis Kourtelos working on the project.

Acknowledgements

This research has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement #737087.

ABBI

Introduction

The ABBI (Audio Bracelet for Blind Interaction) project aims to improve spatial cognition, mobility and social skills in visually impaired children. It is developing motion-sensitive “bracelets” that use sound to convey spatial information (e.g., movement, position), helping children develop spatial awareness and understanding. Our role at Glasgow is to develop software for the project, to research sound design, and to investigate novel ways of using the ABBI bracelets to help people with visual impairments.
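
The bracelet’s core behaviour, sonifying movement, can be illustrated in a few lines: map movement intensity to the loudness and pitch of a short tone. The mapping and parameter values below are illustrative assumptions, not the ABBI firmware:

    import numpy as np

    SAMPLE_RATE = 22_050  # audio samples per second (assumed output rate)

    def sonify_motion(accel_magnitude, base_pitch_hz=440.0, duration_s=0.1):
        # Faster movement gives a louder, higher-pitched tone (one plausible
        # mapping). Returns raw audio samples in [-1, 1].
        intensity = min(accel_magnitude / 20.0, 1.0)  # normalise ~0-2 g to 0-1
        pitch = base_pitch_hz * (1.0 + intensity)
        t = np.linspace(0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
        return intensity * np.sin(2 * np.pi * pitch * t)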

Publications

  • Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently. Euan Freeman, Graham Wilson, Stephen Brewster, Gabriel Baud-Bovy, Charlotte Magnusson, and Hector Caltenco. Proceedings of CHI 2017.
  • Towards a Multimodal Adaptive Lighting System for Visually Impaired Children. Euan Freeman, Graham Wilson, and Stephen Brewster. Proceedings of ICMI 2016 Demos.
  • Multimodal Affective Feedback: Combining Thermal, Vibrotactile, Audio and Visual Signals. Graham Wilson, Euan Freeman, and Stephen Brewster. Proceedings of ICMI 2016 Demos.
  • Automatically Adapting Home Lighting to Assist Visually Impaired Children. Euan Freeman, Graham Wilson, and Stephen Brewster. Proceedings of NordiCHI 2016 Posters.
  • Using Dynamic Audio Feedback to Support Peripersonal Reaching in Young Visually Impaired People. Graham Wilson and Stephen Brewster. Proceedings of ASSETS 2016.
  • Automatically Adapting Home Lighting to Assist Visually Impaired Children. Euan Freeman, Graham Wilson, and Stephen Brewster. Proceedings of CHI 2016 Extended Abstracts.
  • Using Dynamic Audio Feedback to Support Peripersonal Reaching in Visually Impaired People. Graham Wilson and Stephen Brewster. Proceedings of ASSETS 2015 Demos.
  • Effects of Sound Type on Recreating the Trajectory of a Moving Source. Graham Wilson, Stephen Brewster, Hector Caltenco, Charlotte Magnusson, Sara Finocchietti, Gabriel Baud-Bovy, and Monica Gori. Proceedings of CHI 2015 Extended Abstracts.

Project Partners

The ABBI project comprised five partners from a variety of backgrounds. Our role was to investigate feedback and interaction design, whilst developing software for the project.

Acknowledgements

This project was funded by the European Commission’s FP7 programme (#611452).

ABBI: Adaptive Lighting for Visually Impaired Children

Introduction

The ABBI (Audio Bracelet for Blind Interaction) project has been developing an audio bracelet for visually impaired children. Audio bracelets are wearable sound sources that produce sound in response to movement. Their main purpose is rehabilitation for visually impaired children, as they can be used in activities that improve spatial cognition.

Adapting home lighting

The ABBI bracelet has capabilities beyond making sounds in response to movement. One of these is acting as a Bluetooth beacon, a device that can be used for estimating proximity to people or places. At the University of Glasgow, we investigated whether the beacon capabilities of the ABBI bracelet could be used to adapt home lighting based on a child’s location and activity. For example, if we detected that a child had entered their bedroom, we could automatically turn the lights on at full brightness, helping them see furniture and obstacles. We could also change the lighting during play to stimulate a child’s vision; for example, a coloured lamp could create colourful patterns in response to an ABBI bracelet moving around the room.
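
As a rough illustration of how such proximity sensing could drive the lights, the Python sketch below smooths the beacon’s received signal strength (RSSI) and toggles a lamp using enter/exit thresholds with hysteresis. The thresholds, the set_brightness() lamp interface and the smoothing window are illustrative assumptions, not the project’s actual implementation.

    from collections import deque

    RSSI_ENTER = -65  # dBm: bracelet probably in the room (assumed threshold)
    RSSI_EXIT = -80   # dBm: bracelet probably gone (hysteresis avoids flicker)

    class RoomLightController:
        def __init__(self, lamp, window=5):
            self.lamp = lamp                     # any object with set_brightness(0..1)
            self.samples = deque(maxlen=window)  # moving average tames RSSI noise
            self.present = False

        def on_rssi(self, rssi_dbm):
            self.samples.append(rssi_dbm)
            avg = sum(self.samples) / len(self.samples)
            if not self.present and avg > RSSI_ENTER:
                self.present = True
                self.lamp.set_brightness(1.0)    # child entered: full brightness
            elif self.present and avg < RSSI_EXIT:
                self.present = False
                self.lamp.set_brightness(0.0)    # child left: lights off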

A video at the bottom of this page demonstrates a prototype of an adaptive lighting system. We describe the development and initial evaluation of this prototype in two papers, also listed below.

Publications

  • Freeman, E., Wilson, G. and Brewster, S.: Automatically Adapting Home Lighting to Assist Visually Impaired Children. In Proceedings of the 9th Nordic Conference on Human-Computer Interaction – NordiCHI ’16, ACM Press, 2016.
  • Freeman, E., Wilson, G., and Brewster, S.: Towards a Multimodal Adaptive Lighting System for Visually Impaired Children. In Proceedings of ICMI 2016 Demonstrations, ACM Press, pp. 398-399, 2016.

Video demonstration

ABBI: Audible Beacons for Visual Impairment

Introduction

The ABBI (Audio Bracelet for Blind Interaction) project has been developing an audio bracelet for visually impaired children. Audio bracelets are wearable sound sources that produce sound in response to movement. Their main purpose is rehabilitation for visually impaired children, as they can be used in activities that improve spatial cognition. Some of our research at Glasgow investigated other ways of using sound to help visually impaired children, specifically while at nursery or school.

Audio Feedback in Schools

We spoke to visual impairment education experts to discover the problems that young visually impaired children face at nursery and school. These discussions uncovered a number of issues, mostly relating to play and social activities. We also received many suggestions for how sound from audio bracelets and other devices could address these issues; for example, using sound to encourage children to try new activities. Please see our CHI 2017 paper (details at the end) for more about what we learned from this study.

We developed three scenarios that represent the issues we learned about, with examples of how sound from audio bracelets could be used to help. An online survey for visual impairment experts investigated design issues relating to these scenarios and their use of audio feedback. One of our main findings from this survey was that sound should not come only from audio bracelets: it should also come from places and from other people’s locations in the room. We also learned that speech and familiar sounds (e.g., from objects) should be used to inform children about nearby places and activities.

Audible Beacons

Our findings from these studies led to the development of Audible Beacons, devices that can produce sound and be used for estimating proximity to people or places. They are essentially Bluetooth beacons that can be remotely controlled to produce audio feedback. A small form-factor is necessary so that the beacons can be worn by children (like an audio bracelet) or placed in the room (like a beacon).

Audible Beacons combine audio output with Bluetooth beacon capabilities. They can be worn like an audio bracelet or placed in the room like a beacon.

The audio bracelet produced by the ABBI project (shown below) has full Audible Beacon capabilities: it has Bluetooth for remote control, it has beacon functionality, and it can synthesise audio on demand.
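
Remote control of a device like this typically runs over Bluetooth Low Energy. As a sketch using the Python bleak library, a controller might trigger a sound by writing a command byte to a GATT characteristic; the UUID, device address and command format below are hypothetical, not the ABBI bracelet’s actual protocol.

    import asyncio
    from bleak import BleakClient

    # Hypothetical characteristic for sound commands -- not ABBI's real UUID.
    AUDIO_CHAR_UUID = "0000beef-0000-1000-8000-00805f9b34fb"

    async def play_sound(address, sound_id):
        # Connect to the beacon and ask it to synthesise a stored sound.
        async with BleakClient(address) as client:
            await client.write_gatt_char(AUDIO_CHAR_UUID, bytes([sound_id]))

    # Example: trigger sound 1 on a beacon at a made-up address.
    asyncio.run(play_sound("AA:BB:CC:DD:EE:FF", sound_id=1))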

The ABBI audio bracelet. A 3D printed enclosure can be worn on a wrist strap or placed in a room.

Publications

  • Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently. Euan Freeman, Graham Wilson, Stephen Brewster, Gabriel Baud-Bovy, Charlotte Magnusson, and Hector Caltenco. Proceedings of CHI 2017.
  • Automatically Adapting Home Lighting to Assist Visually Impaired Children. Euan Freeman, Graham Wilson, and Stephen Brewster. Proceedings of CHI 2016 Extended Abstracts.

Videos

The following video accompanies our CHI 2017 paper.