Blind people rely mostly on screen readers to consume digital information. Screen readers provide sequential auditory feedback, which hinders a quick overview of the content.
We propose taking advantage of the Cocktail Party Effect: the ability of people to focus their attention on a single voice among several concurrent conversations. In fact, research has shown that concurrent speech can speed up the consumption of digital information. In this project, we investigate novel interactive applications that leverage concurrent speech to improve users' experiences with their devices.
Title: Applications for Concurrent Speech
Date: Jan 1, 2017
Authors: João Guerreiro, Hugo Nicolau, Tiago Guerreiro, Kyle Montague
Keywords: concurrent speech, 3d audio, blind, touchscreen
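The core idea above, playing several speech streams at once and separating them spatially so a listener can attend to one voice, can be sketched with simple constant-power stereo panning. This is a minimal illustration, not the project's actual audio pipeline; the function names and sample values are hypothetical.

```python
import math

def pan_gains(azimuth):
    """Constant-power stereo gains for an azimuth in [-1 (left), +1 (right)]."""
    angle = (azimuth + 1) * math.pi / 4  # map [-1, 1] onto [0, pi/2]
    return math.cos(angle), math.sin(angle)

def mix_concurrent(streams):
    """Mix several mono sample streams into one stereo stream, placing each
    voice at a different azimuth so listeners can exploit the Cocktail
    Party Effect to attend to one of them."""
    n = max(len(samples) for samples, _ in streams)
    left, right = [0.0] * n, [0.0] * n
    for samples, azimuth in streams:
        gl, gr = pan_gains(azimuth)
        for i, x in enumerate(samples):
            left[i] += gl * x
            right[i] += gr * x
    return left, right

# Two hypothetical speech streams: one panned hard left, one hard right.
voice_a = [0.5, 0.5, 0.5]
voice_b = [0.2, 0.2, 0.2]
left, right = mix_concurrent([(voice_a, -1.0), (voice_b, 1.0)])
```

Constant-power panning keeps perceived loudness roughly stable as a voice moves across the stereo field; a full 3D-audio system would instead convolve each stream with head-related transfer functions.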
Word prediction can significantly improve text-entry rates on mobile touchscreen devices. However, these interactions are inherently visual and require constant scanning for new word predictions in order to take advantage of the suggestions. In this paper, we discuss the design space for non-visual word prediction interfaces and present Shout-out Suggestions, a novel interface that provides non-visual access to word predictions on existing mobile devices.
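The word-prediction component the abstract refers to can be approximated by a frequency-ranked prefix lookup; in an interface like Shout-out Suggestions the resulting candidates would be spoken aloud rather than displayed. The vocabulary and counts below are illustrative only.

```python
def predict(prefix, vocabulary, k=3):
    """Return up to k word suggestions for the typed prefix,
    ranked by descending frequency count."""
    matches = [(w, f) for w, f in vocabulary.items() if w.startswith(prefix)]
    matches.sort(key=lambda wf: -wf[1])
    return [w for w, _ in matches[:k]]

# Toy frequency-ranked vocabulary (hypothetical counts).
vocab = {"the": 100, "they": 40, "there": 35, "then": 20, "hello": 10}
print(predict("the", vocab))  # -> ['the', 'they', 'there']
```

A production predictor would use an n-gram or neural language model conditioned on the sentence so far, but the interaction question, how to surface these candidates non-visually, is the same.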
Interaction with large touch surfaces is still a relatively nascent domain, particularly regarding the accessibility solutions offered to blind users. Their smaller mobile counterparts ship with built-in accessibility features that enable non-visual exploration of linearized screen content. However, it is unknown how well these solutions perform on large interactive surfaces that use more complex spatial content layouts. We report on a user study with 14 blind participants performing common touchscreen interactions using one- and two-handed exploration. We investigate the exploration strategies applied by blind users when interacting with a tabletop. We identified six basic strategies that were commonly adopted and should be considered in future designs. We finish with implications for the design of accessible large touch interfaces.
Tablet devices can display full-size QWERTY keyboards similar to physical ones. Yet, the lack of tactile feedback and the inability to rest the fingers on the home keys make exploration a highly demanding and slow task for blind users. We present SpatialTouch, an input system that leverages previous experience with physical QWERTY keyboards by supporting two-handed interaction through multitouch exploration and spatial, simultaneous audio feedback. We conducted a user study with 30 novice touchscreen users entering text under one of two conditions: (1) SpatialTouch or (2) the mainstream accessibility method Explore by Touch. We show that SpatialTouch enables blind users to leverage previous experience, as they make better use of the home keys and perform more efficient exploration paths. Results suggest that although SpatialTouch did not yield faster input rates overall, it was indeed able to leverage previous QWERTY experience, in contrast to Explore by Touch.
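Two ingredients of a SpatialTouch-style keyboard, mapping a touch point to a QWERTY key and deciding which stereo channel announces it, can be sketched as below. This is a simplified model under assumed keyboard dimensions, not SpatialTouch's actual hit-testing or audio rendering.

```python
QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def key_at(x, y, width=10.0, height=3.0):
    """Map a touch point on the keyboard area to a key.
    Each of the three rows spans the full (assumed) keyboard width."""
    row = QWERTY_ROWS[min(int(y / (height / 3)), 2)]
    col = min(int(x / (width / len(row))), len(row) - 1)
    return row[col]

def audio_channel(x, width=10.0):
    """Spatial feedback sketch: keys explored on the left half of the
    keyboard are spoken in the left channel, right-half keys in the
    right channel, supporting two-handed exploration."""
    return "left" if x < width / 2 else "right"
```

For example, `key_at(0.5, 0.5)` resolves to `'q'` and is announced in the left channel, while a touch on the far right of the bottom row resolves to `'m'` in the right channel, so simultaneous feedback from both hands remains distinguishable.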