The ABBI (Audio Bracelet for Blind Interaction) Project aims to improve spatial cognition, mobility and social skills in blind children through motion-sensitive “bracelets” worn on the wrists or ankles, using the auditory modality to convey spatial information (movement, position). It is an EU-funded collaboration with IIT, Lund University, the University of Hamburg and the Istituto David Chiossone Onlus. The core idea of the ABBI system is to improve the spatial cognition of visually impaired individuals through other sensory modalities (e.g. touch and hearing). To that end, the project will develop technologies and procedures to rehabilitate the brain processes and functions involved in spatial cognition in children and adults with visual disabilities, through natural audio-motor and, possibly, tactile-motor association. In particular, the idea is that the auditory modality could play the role that visual signals normally play in the development of sensorimotor skills, spatial cognition, navigation and social interaction. This approach is based on a renewed understanding of the role of vision, and of the role that another sensory signal (e.g. audio) associated with the motor signal might have in helping a blind child build a sense of space.
The research at Glasgow University focuses on two aspects: 1) the design of suitable audio and haptic feedback, and 2) exploring the potential of other wearable devices for providing information to the visually impaired. The Multimodal Interaction Group has an established history of designing non-visual feedback for a variety of interfaces, including those for the visually impaired. For ABBI, we will apply this expertise to designing audio feedback that is clear, informative and enjoyable for users of various ages (from several months up to the teenage years) and in environments with varying audio complexity.
With the proliferation of wearable devices such as watches, glasses and arm bands, it is becoming easier to provide new sensors and feedback methods for the visually impaired in small, simple form factors. With greater access to information could come greater freedom and security. At Glasgow, we will look at expanding the information available to visually impaired users: which information is most useful, how to convey it, and how users can control the amount and type of information presented to them.
Following the Route of a Moving Sound Source Through a Room
The ABBI (Audio Bracelet for Blind Interaction) device is designed to be worn on the wrist by visually impaired and blind children, producing sound based on the movement of the arm through space. Its primary function is to inform a child (or adult) about his/her own movements, to aid spatial cognition rehabilitation. However, the device could also be worn by friends and family to inform the visually impaired person of others’ movements in the environment. In this paper, we describe an initial experiment that measured how well blindfolded sighted individuals could track a moving sound source in 2D horizontal space and then walk the same route to the same end position. Six sounds, including natural sounds, abstract sounds, Earcons and speech, were compared to identify which type of sound produced the most accurate route recreation. Our preliminary test suggests that the sounds facilitated recreation of 2D horizontal movement trajectories similarly well, although birdsong proved problematic, while speech and wave sounds were more promising. This may mean that ABBI sounds can be personalised while retaining their positive effects for rehabilitative support.
Supporting Accurate Reaching with Dynamic Audio Feedback
Blind children engage with their immediate environment much less than sighted children, particularly through self-initiated movement or exploration. Research has suggested that providing dynamic feedback about the environment, and about the child’s actions within it, may help to encourage reaching activity and support spatial cognitive learning. This paper investigated whether the accuracy of peripersonal reaching can be improved by dynamic sound emitted both from the objects to be reached for and from the reaching hand itself (via a worn speaker). As a first step, we ran a study testing the efficacy of static and dynamic audio feedback designs with blind and visually impaired young adults, to identify optimal feedback designs. The results showed that dynamic audio feedback helps to build connections and spatial links between the objects and the reaching hand, and that participants were able to reach for objects more accurately than with constant (unchanging) feedback.
- Wilson, G. & Brewster, S. (2015) “Using Dynamic Audio Feedback to Support Peripersonal Reaching in Visually Impaired People”, to appear in Proceedings of ASSETS 2015, Oct 26-28, Lisbon, Portugal.
- Magnusson, C., Caltenco, H., Finocchietti, S., Cappagli, G., Wilson, G. & Gori, M. (2015) “What Do You Like? Early Design Explorations of Sound and Haptic Preferences”, to appear in Proceedings of MobileHCI 2015 Posters, Aug 24-27, Copenhagen, Denmark.
- Wilson, G., Brewster, S., Caltenco, H., Magnusson, C., Finocchietti, S., Baud-Bovy, G. & Gori, M. (2015) “Effects of Sound Type on Recreating the Trajectory of a Moving Source”, in Proceedings of CHI 2015 Extended Abstracts, Apr 18-23, Seoul, South Korea.