Accessibility in the Wild

Over the last decades there has been extensive research on mobile accessibility. However, most of it has been conducted in laboratory settings, producing only a snapshot of user performance. Understanding how performance changes over time and how people truly use their mobile devices remains an open question. A deeper knowledge of users' challenges, frustrations, and overall real-world experience is of utmost importance to improve current mobile technologies.

Our Proposal

In this project, we are creating the tools and gathering the knowledge to characterize user performance in the wild (i.e., in real-world settings) in order to improve the devices and interfaces people use every day.
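
To make this concrete, here is a minimal sketch of the kind of instrumentation such work relies on: a background logger that appends timestamped interaction events to a local file for later analysis. All names and the log format are hypothetical illustrations, not the project's actual tools.

```python
import json
import time
from pathlib import Path

# Hypothetical log location; the project's actual tooling is not shown here.
LOG_FILE = Path("interaction_log.jsonl")

def log_event(event_type, details=None):
    """Append one timestamped interaction event as a JSON line."""
    record = {
        "timestamp": time.time(),   # seconds since the epoch
        "event_type": event_type,   # e.g. "touch_down", "key_press"
        "details": details or {},   # free-form payload (coordinates, key, app, ...)
    }
    with LOG_FILE.open("a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")

# Example usage: record a touch and a keystroke observed during everyday use.
log_event("touch_down", {"x": 412, "y": 880, "app": "keyboard"})
log_event("key_press", {"key": "a"})
```

An append-only, line-delimited log like this keeps collection cheap on the device and lets performance measures be computed offline, after the data leaves the phone.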

Project Details

Title: Accessibility in the Wild

Date: Jan 1, 2017

Authors: André Rodrigues, Hugo Nicolau, Kyle Montague, Tiago Guerreiro, João Guerreiro

Keywords: accessibility, mobile, laboratory, in-the-wild, everyday, touchscreen, performance


Related Publications


    • In-context Q&A to Support Blind People Using Smartphones
    • Rodrigues, André and Montague, Kyle and Nicolau, Hugo and Guerreiro, João and Guerreiro, Tiago
    • Proceedings of the 19th International ACM SIGACCESS Conference on Computers and Accessibility, 2017
    • [ABSTRACT] [PDF]
    • Blind people face many barriers using smartphones. Still, previous research has been mostly restricted to non-visual gestural interaction, paying little attention to the deeper daily challenges of blind users. To bridge this gap, we conducted a series of workshops with 42 blind participants, uncovering application challenges across all levels of expertise, most of which could only be surpassed through a support network. We propose Hint Me!, a human-powered service that allows blind users to get in-app assistance by posing questions or browsing previously answered questions on a shared knowledge-base. We evaluated the perceived usefulness and acceptance of this approach with six blind people. Participants valued the ability to learn independently and anticipated a series of usages: labeling, layout and feature descriptions, bug workarounds, and learning to accomplish tasks. Creating or browsing questions depends on aspects like privacy, knowledge of respondents and response time, revealing the benefits of a hybrid approach.


    • Investigating Laboratory and Everyday Typing Performance of Blind Users
    • Nicolau, Hugo and Montague, Kyle and Guerreiro, Tiago and Rodrigues, André and Hanson, Vicki L.
    • ACM Trans. Access. Comput., 4:1–4:26, 2017
    • [ABSTRACT] [PDF] [LIBRARY]
    • Over the last decade there have been numerous studies on touchscreen typing by blind people. However, there are no reports about blind users’ everyday typing performance and how it relates to laboratory settings. We conducted a longitudinal study involving five participants to investigate how blind users truly type on their smartphones. For twelve weeks, we collected field data, coupled with eight weekly laboratory sessions. This paper provides a thorough analysis of everyday typing data and its relationship with controlled laboratory assessments. We improve state-of-the-art techniques to obtain intent from field data and provide insights on real-world performance. Our findings show that users improve over time, even though it is at a slow rate. Substitutions are the most common type of error and have a significant impact on entry rates in both field and laboratory settings. Results show that participants are 1.3 to 2 times faster when typing during everyday tasks, but they are also less accurate. We finish by deriving implications that should inform the design of future virtual keyboards for non-visual input. Moreover, our findings should be of interest to keyboard designers and researchers looking to conduct field studies to understand everyday input performance.
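
For readers unfamiliar with these measures, the sketch below computes two standard text-entry metrics referenced in this line of work: words per minute and a character-level error rate based on minimum string distance. It is a generic illustration of the standard formulas, not the paper's actual analysis pipeline.

```python
def words_per_minute(transcribed: str, seconds: float) -> float:
    """Standard WPM: (characters - 1) / 5 notional words over elapsed minutes."""
    return ((len(transcribed) - 1) / 5.0) / (seconds / 60.0)

def msd_error_rate(presented: str, transcribed: str) -> float:
    """Minimum string distance error rate (%), counting insertions,
    deletions, and substitutions between presented and transcribed text."""
    m, n = len(presented), len(transcribed)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if presented[i - 1] == transcribed[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution or match
    return 100.0 * d[m][n] / max(m, n, 1)

# Example: a short phrase typed in 30 seconds with one substitution error.
print(words_per_minute("hello worle", 30.0))         # 4.0 WPM
print(msd_error_rate("hello world", "hello worle"))  # ~9.1% error rate
```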


    • Typing Performance of Blind Users: An Analysis of Touch Behaviors, Learning Effect, and In-Situ Usage
    • Nicolau, Hugo and Montague, Kyle and Guerreiro, Tiago and Rodrigues, André and Hanson, Vicki L.
    • Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility, 273–280, 2015
    • [ABSTRACT] [PDF] [LIBRARY]
    • Non-visual text-entry for people with visual impairments has focused mostly on the comparison of input techniques, reporting on performance measures such as accuracy and speed. While researchers have been able to establish that non-visual input is slow and error prone, there is little understanding of how to improve it. To develop a richer characterization of typing performance, we conducted a longitudinal study with five novice blind users. For eight weeks, we collected in-situ usage data and conducted weekly laboratory assessment sessions. This paper presents a thorough analysis of typing performance that goes beyond traditional aggregated measures of text-entry and reports on character-level errors and touch measures. Our findings show that users improve over time, even though it is at a slow rate (0.3 WPM per week). Substitutions are the most common type of error and have a significant impact on entry rates. In addition to text input data, we analyzed touch behaviors, looking at touch contact points, exploration movements, and lift positions. We provide insights on why and how performance improvements and errors occur. Finally, we derive some implications that should inform the design of future virtual keyboards for non-visual input.


    • Getting Smartphones to Talkback: Understanding the Smartphone Adoption Process of Blind Users
    • Rodrigues, André and Montague, Kyle and Nicolau, Hugo and Guerreiro, Tiago
    • Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility, 23–32, 2015
    • [ABSTRACT] [PDF] [LIBRARY]
    • The advent of system-wide accessibility services on mainstream touch-based smartphones has been a major point of inclusion for blind and visually impaired people. Ever since, researchers have aimed to improve the accessibility of specific tasks, such as text-entry and gestural interaction. However, little work has aimed to understand and improve the overall accessibility of these devices in real-world settings. In this paper, we present an eight-week-long study with five novice blind participants in which we seek to understand major concerns, expectations, challenges, barriers, and experiences with smartphones. The study included pre-adoption and weekly interviews, weekly controlled task assessments, and in-the-wild system-wide usage. Our results show that mastering these devices is an arduous and long task, confirming the users’ initial concerns. We report on accessibility barriers experienced throughout the study, which could not be encountered in task-based laboratory settings. Finally, we discuss how smartphones are being integrated into everyday activities and highlight the need for better adoption support tools.


    • TinyBlackBox: Supporting Mobile In-The-Wild Studies
    • Montague, Kyle and Rodrigues, André and Nicolau, Hugo and Guerreiro, Tiago
    • Proceedings of the 17th International ACM SIGACCESS Conference on Computers & Accessibility, 379–380, 2015
    • [ABSTRACT] [PDF] [LIBRARY]
    • Most work investigating mobile HCI is carried out within controlled laboratory settings; these spaces are not representative of the real-world environments in which the technology will predominantly be used. As a result, they can produce a skewed or inaccurate understanding of interaction behaviors and users’ abilities. While mobile in-the-wild studies provide more realistic representations of technology usage, there are additional challenges to conducting data collection outside of the lab. In this paper we discuss these challenges and present TinyBlackBox, a standalone data collection framework to support mobile in-the-wild studies with today’s smartphone and tablet devices.


    • Motor-impaired Touchscreen Interactions in the Wild
    • Montague, Kyle and Nicolau, Hugo and Hanson, Vicki L.
    • Proceedings of the 16th International ACM SIGACCESS Conference on Computers & Accessibility, 123–130, 2014
    • [ABSTRACT] [PDF] [LIBRARY]
    • Touchscreens are pervasive in mainstream technologies; they offer novel user interfaces and exciting gestural interactions. However, to interpret and distinguish between the vast range of gestural inputs, the devices require users to consistently perform interactions in line with the predefined location, movement, and timing parameters of the gesture recognizers. For people with variable motor abilities, particularly hand tremors, performing these input gestures can be extremely challenging and imposes limitations on the possible interactions the user can make with the device. In this paper, we examine touchscreen performance and interaction behaviors of motor-impaired users on mobile devices. The primary goal of this work is to measure and understand the variance of touchscreen interaction performance by people with motor impairments. We conducted a four-week in-the-wild user study with nine participants using a mobile touchscreen device. A Sudoku stimulus application measured their interaction performance abilities during this time. Our results show that not only does interaction performance vary significantly between users, but also that an individual’s interaction abilities are significantly different between device sessions. Finally, we propose and evaluate the effect of novel tap gesture recognizers to accommodate individual variances in touchscreen interactions.
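
As a rough illustration of what a recognizer tuned to an individual might look like (a simplified sketch with invented thresholds, not the recognizers proposed in the paper), the classifier below accepts a touch as a tap only when its duration and finger movement stay within limits calibrated from that user's own sample taps.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Touch:
    down_x: float
    down_y: float
    up_x: float
    up_y: float
    duration_ms: float

class PersonalizedTapRecognizer:
    """Accepts a touch as a tap if its duration and movement fall within
    thresholds derived from a user's own calibration taps (illustrative only)."""

    def __init__(self, sample_taps, slack=1.5):
        durations = [t.duration_ms for t in sample_taps]
        movements = [hypot(t.up_x - t.down_x, t.up_y - t.down_y) for t in sample_taps]
        # Widen the observed maxima by a slack factor to tolerate natural variability.
        self.max_duration = max(durations) * slack
        self.max_movement = max(movements) * slack

    def is_tap(self, touch: Touch) -> bool:
        movement = hypot(touch.up_x - touch.down_x, touch.up_y - touch.down_y)
        return touch.duration_ms <= self.max_duration and movement <= self.max_movement

# Calibrate on two sample taps, then classify a new, slightly shakier touch.
samples = [Touch(100, 100, 102, 101, 120), Touch(200, 150, 203, 148, 180)]
recognizer = PersonalizedTapRecognizer(samples)
print(recognizer.is_tap(Touch(50, 60, 54, 63, 210)))  # True: within the widened limits
```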


    • NavTap: A Long Term Study with Excluded Blind Users
    • Guerreiro, Tiago and Nicolau, Hugo and Jorge, Joaquim and Gonçalves, Daniel
    • Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility, 99–106, 2009
    • [ABSTRACT] [PDF] [LIBRARY]
    • NavTap is a navigational method that enables blind users to input text on a mobile device by reducing the associated cognitive load. In this paper, we present studies that go beyond a laboratory setting, exploring the method’s effectiveness and learnability as well as its influence on the users’ daily lives. Eight blind users participated in designing the prototype (3 weeks) while five took part in the studies over 16 more weeks. Results gathered in controlled weekly sessions and real-life usage logs enabled us to better understand NavTap’s advantages and limitations. The method proved easy both to learn and to improve with. Indeed, users were able to better control their mobile devices to send SMS messages and perform other tasks that require text input, such as managing a phonebook, from day one in real-life settings. While individual user profiles play an important role in determining their evolution, even less capable users (with age-induced impairments or cognitive difficulties) were able to perform the assigned tasks (SMS, directory) both in the laboratory and in everyday use, showing continuous improvement of their skills. According to interviews, none were able to input text before. NavTap dramatically changed their relationship with mobile devices and noticeably improved their social interaction capabilities.


    • NavTap: Um Estudo de Longa Duração com Utilizadores Cegos (NavTap: A Long-Term Study with Blind Users; in Portuguese). Best Student Paper Award
    • Guerreiro, Tiago and Nicolau, Hugo and Jorge, Joaquim and Gonçalves, Daniel
    • Proceedings of the 17th Encontro Português de Computação Gráfica (EPCG), 2009
    • [ABSTRACT] [PDF]
    • NavTap is a navigational method that enables blind users to input text on a mobile device by reducing the associated cognitive load. We present studies that go beyond a laboratory setting, exploring the method’s effectiveness and learnability as well as its influence on the users’ daily lives. Eight blind users participated in the prototype’s design (3 weeks) while five took part in the studies over 16 more weeks. All were unable to input text before. Results gathered in controlled weekly sessions and real-life interaction logs revealed the method to be easy to learn and to improve performance with, as users were able to fully control their mobile devices from first contact in real-life scenarios. Individual profiles play an important role in determining evolution, and even less capable users (with age-induced impairments or cognitive difficulties) were able to perform the required tasks, in and out of the laboratory, with continuous improvement. NavTap dramatically changed the users’ relationship with their devices and improved their social interaction capabilities.
