We investigate the use of social robots to create inclusive mixed-visual ability classrooms.
We investigate the use of tangible systems to promote computational thinking skills in mixed-ability children.
AVATAR proposes creating a signing 3D avatar able to synthesize Portuguese Sign Language.
ARCADE proposes leveraging interactive and digital technologies to create context-aware workspaces to improve physical rehabilitation practices.
Although text-entry is an inherently visually demanding task, we are creating novel non-visual input methods for multiple form factors, from tablets to smartwatches.
Braille 21 is an umbrella term for a series of research projects that aim to bring Braille into the 21st century. Our goal is to facilitate access to Braille in the new digital era.
In this project, we are creating the tools to characterize user performance in the wild and improve current everyday devices and interfaces.
We investigate novel interfaces and interaction techniques for nonvisual word completion. We are particularly interested in quantifying the benefits and costs of such new solutions.
As touchscreens have evolved to provide multitouch capabilities, we are exploring new multi-point feedback solutions.
In this research work, we are investigating novel interactive applications that leverage concurrent speech to improve users' experiences.
This research leverages mobile and wearable technologies to improve classroom accessibility for Deaf and Hard of Hearing college students.
Our goal is to thoroughly study mobile touchscreen interfaces, their characteristics and parameterizations, thus providing the tools for informed interface design.
This project investigates how accurate tracking systems and engaging activities can be leveraged to provide effective evaluation procedures in physical rehabilitation.
We aim to understand the overlap of problems faced by users with health-related and situational impairments when using their mobile devices, and to design solutions for both user groups.
Storytelling has the potential to be an inclusive and collaborative activity. However, it is unclear how interactive storytelling systems can support such activities, particularly when considering mixed-visual ability children. In this paper, we present an interactive multisensory storytelling system and explore the extent to which an emotional robot can be used to support inclusive experiences. We investigate the effect of the robot's emotional behavior on the joint storytelling process, the resulting narratives, and collaboration dynamics. Results show that when children co-create stories with a robot that exhibits emotional behaviors, they include more emotive elements in their stories and explicitly accept more ideas from their peers. We contribute a multisensory environment that enables children with visual impairments to engage in joint storytelling activities with their peers, and we analyze the effect of a robot's emotional behaviors on an inclusive storytelling experience.
Typing on mobile devices is a common and complex task. The act of typing itself encodes rich information, such as the typing method, the context it is performed in, and individual traits of the person typing. Researchers are increasingly using a selection or combination of experience sampling and passive sensing methods in real-world settings to examine typing behaviours. However, there is limited understanding of the effects these methods have on measures of input speed, typing behaviours, compliance, and perceived trust and privacy. In this paper, we investigate the tradeoffs of everyday data collection methods. We contribute empirical results from a four-week field study (N=26) in which participants contributed by transcribing, composing, passively having their sentences analyzed, and reflecting on their contributions. We present a tradeoff analysis of these data collection methods, discuss their impact on text-entry applications, and contribute a flexible research platform for in-the-wild text-entry studies.
The rising trend of Personal Informatics has leveraged mobile applications to help users track their wellbeing; however, these digital solutions focus on quantitative data, lacking the insights provided by the qualitative data kept in paper notebooks. We propose to digitally augment a paper diary to support both analogue and digital data, bridging the gap between qualitative and quantitative data tracking practices to support better awareness of and reflection on health data. As a first case study, we designed a self-tracking tool to help college students manage their wellbeing by increasing self-awareness and easing help-seeking behaviours. Next, we conducted a longitudinal study to validate the tool's effectiveness and analyse its acceptability. Results show that our approach helped students by allowing moments of self-reflection and self-awareness. Additionally, our findings suggest that qualitative data is most useful when important events and abrupt changes to wellbeing occur. Preference for paper or digital diaries is highly user-dependent; however, most participants favoured a digital-only tool with note-taking capabilities.
The recent rise of remote approaches to group physical activity has shown that these strategies lack social engagement. Following a user-centred design process grounded in the dimensions of the Playful Experience (PLEX) Framework, we developed an augmentation of video conference-based group exercise to enhance the social dynamics of high-intensity interval training. We conducted a user study (N = 12) to analyse the effect of our approach on the perceived playfulness of the experience and on participants' enjoyment and effort. Results show an increase in the PLEX Framework dimensions of Competition and Sensation. Additionally, our findings suggest positive trends in participants' enjoyment and effort, raising new design implications for the design space of video conference-based group exercise interfaces.
This pictorial presents the concept and preliminary design of “Tecnico Go!”, an application conceived to support college students during the COVID-19 pandemic. The project applied HCI and user-centred design methods to understand students’ needs, pains, and desires during the pandemic. The authors collected, unpacked, and reflected on user-centred data, and synthesised personas and scenarios via a research-through-design approach. The synthesis of results points to the concept of “floating students” as a model embracing flexibility in visiting the campus safely and using its facilities when needed. Problem-oriented student journeys were identified and then used to ideate solution-oriented scenarios and a service blueprint to illuminate the continued development of the “Tecnico Go!” mobile application. This pictorial illustrates the process from data collection through customer journeys, a user-centred blueprint, and wireframes.
Touch data, and in particular text-entry data, has mostly been collected in the laboratory, under controlled conditions. While touch and text-entry data has consistently shown its potential for monitoring and detecting a variety of conditions and impairments, its deployment in the wild remains a challenge. In this paper, we present WildKey, an Android keyboard toolkit that allows for the usable deployment of in-the-wild user studies. WildKey is able to analyse text-entry behaviours through implicit and explicit text-entry data collection while ensuring user privacy. We detail each of WildKey's components and features and all of the metrics collected, and discuss the steps taken to ensure user privacy and promote compliance.