16/4/18 CHI 2018

We’re super excited about CHI 2018, which starts next week! We have three full papers at the conference and are also participating in some workshops.

Full Papers

  • Point-and-Shake: Selecting from Levitating Object Displays.
    • Euan Freeman, Julie Williamson, Sriram Subramanian and Stephen Brewster.
    • From the Levitate project.
  • Object Manipulation in Virtual Reality Under Increasing Levels of Translational Gain.
    • Graham Wilson, Mark McGill, Matthew Jamieson, Julie Williamson and Stephen Brewster.
  • Investigating Perceptual Congruence Between Data and Display Dimensions in Sonification.
    • Jamie Ferguson and Stephen Brewster.

Workshop Papers

  • Haptic Feedback for Mid-Air Gestures: What, When, How?
    • Euan Freeman, Praxitelis Kourtelos, Julie Williamson, and Stephen Brewster.
    • CHI 2018 Workshop on Mid-Air Haptics for Control Interfaces.
    • From the Levitate project.
  • On-Wheel vs Mid-Air Haptics: Where is Best for In-Car Interaction?
    • Graham Wilson and Stephen Brewster.
    • CHI 2018 Workshop on Mid-Air Haptics for Control Interfaces.
    • From the Prestige project.

10/1/17 CHI 2017 Papers

We are pleased to announce that we have had five full papers accepted to CHI 2017:

  • Audible Beacons and Wearables in Schools: Helping Young Visually Impaired Children Play and Move Independently.
    • Euan Freeman, Graham Wilson, Stephen Brewster, Gabriel Baud-Bovy, Charlotte Magnusson, and Hector Caltenco
    • From the ABBI project
  • Multi-moji: Combining Thermal, Vibrotactile & Visual Stimuli to Expand the Affective Range of Feedback.
    • Graham Wilson and Stephen Brewster
  • An Evaluation of Input Controls for In-Car Interactions.
    • Alexander Ng, Stephen Brewster, Frank Beruscha, and Wolfgang Krautter
    • From the HAPPINESS project
  • I Am The Passenger: How Visual Motion Cues Can Influence Sickness For In-Car VR.
    • Mark McGill, Alexander Ng, and Stephen Brewster
  • ForgetMeNot: Active Reminder Entry Support for Adults with Acquired Brain Injury.
    • Matthew Jamieson, Brian O’Neill, Breda Cullen, Marilyn McGee-Lennon, Stephen Brewster, and Jonathan Evans

Mark McGill will also be presenting an ACM TOCHI article at CHI. More details about this will be added soon.

11/5/16 CHI ’16

San Jose Convention Centre

We are in sunny San Jose! If you see the Glasgow Interactive Systems Group logo then come and say hi! We still have a couple of talks this afternoon and we’ll be near two posters in the Extending User Capabilities section.

4/5/16 MIG at CHI ’16

Next week we’re going to swap the dreary Scottish weather for the warmth of San Jose, California, to attend CHI 2016. We’ll be there presenting two full papers, three late-breaking work posters and a workshop paper. Look out for the Glasgow Interactive Systems Group logo and come and say hi! We look forward to seeing you there!

Papers at CHI ’16

  • Do That, There: An Interaction Technique for Addressing In-Air Gesture Systems. Euan Freeman, Stephen Brewster, and Vuokko Lantz. (full paper)
  • Hot Under the Collar: Mapping Thermal Feedback to Dimensional Models of Emotion. Graham Wilson, Dobromir Dobrev, and Stephen Brewster. (full paper)
  • Using Sound to Help Visually Impaired Children Play Independently. Euan Freeman and Stephen Brewster. (late-breaking work)
  • Evaluating Haptic Feedback on a Steering Wheel in a Simulated Driving Scenario. Gozel Shakeri, Stephen Brewster, John Williamson, and Alex Ng. (late-breaking work)
  • Mapping Abstract Visual Feedback to a Dimensional Model of Emotion. Graham Wilson, Pietro Romeo, and Stephen Brewster. (late-breaking work)
  • Towards Mid-Air Haptic Widgets. Euan Freeman, Dong-Bach Vo, Graham Wilson, Gozel Shakeri, and Stephen Brewster. (workshop paper)

5/3/15 CHI Preview #3: Usability of Virtual Reality

A Dose of Reality: Overcoming Usability Challenges in VR Head-Mounted Displays

by Mark McGill, Daniel Boland, Roderick Murray-Smith and Stephen Brewster

Abstract

We identify usability challenges facing consumers adopting Virtual Reality (VR) head-mounted displays (HMDs) in a survey of 108 VR HMD users. Users reported significant issues in interacting with, and being aware of, their real-world context when using an HMD. Building upon existing work on blending real and virtual environments, we performed three design studies to address these usability concerns. In a typing study, we show that augmenting VR with a view of reality significantly corrected the performance impairment of typing in VR. We then investigated how much reality should be incorporated and when, so as to preserve users’ sense of presence in VR. For interaction with objects and peripherals, we found that selectively presenting reality as users engaged with it was optimal in terms of performance and users’ sense of presence. Finally, we investigated how this selective, engagement-dependent approach could be applied in social environments, to support the user’s awareness of the proximity and presence of others.
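As a rough illustration of the "selectively presenting reality" idea described above, the sketch below fades a passthrough view of a peripheral into VR as the user's tracked hand approaches it. This is our own minimal example, not the paper's implementation; the distance thresholds, positions and function name are invented for illustration.

    import numpy as np

    ENGAGE_DIST = 0.15   # metres; assumed distance at which the user is "engaged"
    FULL_VR_DIST = 0.40  # metres; beyond this, show the pure virtual environment

    def passthrough_alpha(hand_pos, peripheral_pos):
        # 0.0 = pure VR, 1.0 = full camera view of the real peripheral.
        d = np.linalg.norm(np.asarray(hand_pos) - np.asarray(peripheral_pos))
        # Ramp the blend linearly between the two distance thresholds.
        return float(np.clip((FULL_VR_DIST - d) / (FULL_VR_DIST - ENGAGE_DIST),
                             0.0, 1.0))

    # Example: a hand 20 cm from the keyboard gets a partially blended view.
    print(passthrough_alpha([0.0, 0.0, 0.20], [0.0, 0.0, 0.0]))  # 0.8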

4/3/15 CHI Preview #2: In-Car Multimodal Displays

To Beep or Not to Beep? Comparing Abstract versus Language-based Multimodal Driver Displays

by Ioannis Politis, Stephen Brewster and Frank Pollick

Abstract

Multimodal displays are increasingly being utilized as driver warnings. Abstract warnings, without any semantic association to the signified event, and language-based warnings are examples of such displays. This paper presents a first comparison between these two types, across all combinations of audio, visual and tactile modalities. Speech, text and Speech Tactons (a novel form of tactile warnings synchronous to speech) were compared to abstract pulses in two experiments. Results showed that recognition times of warning urgency during a non-critical driving situation were shorter for abstract warnings, highly urgent warnings and warnings including visual feedback. Response times during a critical situation were shorter for warnings including audio. We therefore suggest abstract visual feedback when informing drivers during a non-critical situation and audio in a highly critical one. Language-based warnings during a critical situation performed equally well as abstract ones, so they are suggested as less annoying vehicle alerts.
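The abstract does not spell out how Speech Tactons are constructed. One plausible construction, sketched below purely as an illustration (our assumption, not the authors' code), is to modulate a vibrotactile carrier with the amplitude envelope of the spoken warning, so the vibration rises and falls in sync with the speech:

    import numpy as np

    def speech_tacton(speech, sample_rate, carrier_hz=250.0, frame_s=0.01):
        # Frame-wise RMS amplitude envelope of the speech signal.
        hop = int(sample_rate * frame_s)
        rms = np.array([np.sqrt(np.mean(speech[i:i + hop] ** 2))
                        for i in range(0, len(speech) - hop + 1, hop)])
        envelope = np.repeat(rms / max(rms.max(), 1e-9), hop)
        # Modulate a sine carrier with the envelope; 250 Hz is an assumed
        # carrier, a common resonant frequency for vibrotactile actuators.
        t = np.arange(envelope.size) / sample_rate
        return envelope * np.sin(2 * np.pi * carrier_hz * t)

    # e.g. tacton = speech_tacton(speech_samples, 16000)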

2/3/15 CHI Preview #1: Thermal Feedback

In the Heat of the Moment: Subjective Interpretations of Thermal Feedback During Interaction

by Graham Wilson, Gavin Davidson and Stephen Brewster

Description

Thermal feedback can be an engaging and convincing means of conveying information, but its meaning in interaction can be ambiguous, so interface designers may not be sure how users naïvely interpret thermal feedback during interaction. The research in this paper measured subjective interpretations of thermal stimuli in four different scenarios: social media activity, a colleague’s presence, the extent of use of digital content, and restaurant experiences. Guidelines for the design of thermal feedback are presented to help others create effective thermal interfaces.

Abstract

Research has shown that thermal feedback can be an engaging and convincing means of conveying experimenter-predefined meanings, e.g., material properties or message types. However, thermal perception is subjective and its meaning in interaction can be ambiguous. Interface designers may not be sure how users could naïvely interpret thermal feedback during interaction. Little is also known about how users would choose thermal cues to convey their own meanings. The research in this paper tested subjective interpretations of thermal stimuli in three different scenarios: social media activity, a colleague’s presence and the extent of use of digital content. Participants were also asked to assign their own thermal stimuli to personal experiences, to help us understand what kinds of stimuli people associate with different meanings. The results showed strong agreement among participants concerning what warmth means (presence, activity, quality) and what cool means (absence, poor quality). Guidelines for the design of thermal feedback are presented to help others create effective thermal interfaces.

18/02/15 Three Papers and a WiP at CHI ’15

We have had three full papers accepted for CHI ’15 in Seoul, Korea! Mark’s looks at techniques for bringing reality into virtual reality, Graham’s describes a series of experiments on thermal output, and Ioannis’ is about in-car multimodal notifications. Graham also has a WiP on recreating the trajectories of moving sound sources for ABBI. See you in Seoul!

30/03/14 CHI Preview #4

Perception of Ultrasonic Haptic Feedback on the Hand: Localisation and Apparent Motion

Authors: Graham Wilson, Thomas Carter, Sriram Subramanian and Stephen Brewster
CHI Session: Touch & Stylus Interaction
Presentation: Tuesday April 29th at 10:00 (Room 718B)

Summary
Ultrasonic haptic feedback involves the creation of focused air pressure waves from an array of ultrasound transducers. These are reflected off the skin to create tactile sensations without being in direct contact with an actuator. It is potentially useful for gestural interfaces, such as those that utilise body position, hand movements or finger gestures for input, as these interfaces suffer from a lack of tactile feedback. The technique is relatively new compared to other forms of tactile feedback, such as vibration motors or pin-arrays. Consequently, there has been less controlled and rigorous research into the perception of ultrasonic haptic feedback, which is vital if it is to be used in HCI.
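To make the focusing idea concrete, here is a minimal sketch of how a phased array can place a pressure focus at a point in mid-air: each transducer is driven with a phase advance that compensates for its propagation delay to the focal point, so all the waves arrive there in phase. The 40 kHz frequency, the grid layout and the focus_phases helper below are illustrative assumptions of ours, not details taken from the paper.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
    FREQUENCY = 40_000.0    # Hz; an assumed (typical) transducer frequency

    def focus_phases(transducer_positions, focal_point):
        # Drive each transducer with a phase advance equal to its propagation
        # phase lag, so all waves arrive at the focal point in phase and sum
        # into a localised region of high acoustic pressure.
        distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
        wavelength = SPEED_OF_SOUND / FREQUENCY  # about 8.6 mm at 40 kHz
        return (2 * np.pi * distances / wavelength) % (2 * np.pi)

    # Hypothetical 16 x 16 transducer grid with 10 mm pitch in the z = 0 plane.
    xs, ys = np.meshgrid(np.arange(16) * 0.01, np.arange(16) * 0.01)
    grid = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])

    # Phases that focus the array 20 cm above its centre.
    phases = focus_phases(grid, np.array([0.075, 0.075, 0.20]))

Because constant ultrasound is not itself perceivable on the skin, systems like this typically modulate the focal point at a tactile frequency (for example, around 200 Hz) so that it produces a vibration the skin can feel.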

We help to address this by identifying the factors that influence the perception of two fundamental aspects of tactile feedback: localisation and motion across the hand. Prior research on ultrasonic haptics has tested the detection and differentiation of one or multiple points of feedback, measured the two-point visual-tactile threshold, and presented interaction prototypes with limited user studies. Research is still needed on which spatial or temporal parameters influence localisation and the perception of motion.

This paper presents two lab experiments. The first tested localisation of static feedback on the hand to determine spatial resolution for ultrasonic haptics. The second tested the perception of motion across two axes on the hand, to identify which characteristics of feedback (distance, duration, number of stimulated positions and movement direction) elicit convincing sensations of motion.

Abstract
Ultrasonic haptic feedback is a promising means of providing tactile sensations in mid-air without encumbering the user with an actuator. However, controlled and rigorous HCI research is needed to understand the basic characteristics of perception of this new feedback medium, and so how best to utilise ultrasonic haptics in an interface. This paper describes two experiments conducted into two fundamental aspects of ultrasonic haptic perception: 1) localisation of a static point and 2) the perception of motion. Understanding these would provide insight into 1) the spatial resolution of an ultrasonic interface and 2) what forms of feedback give the most convincing illusion of movement. Results show an average localisation error of 8.5mm, with higher error along the longitudinal axis. Convincing sensations of motion were produced when travelling longer distances, using longer stimulus durations and stimulating multiple points along the trajectory. Guidelines for feedback design are given.
