We investigate how technology can empower citizens and non-state actors to co-create and take an active role in shaping agendas.
We investigate the use of social robots to create inclusive mixed-visual ability classrooms.
We investigate the use of tangible systems to promote computational thinking skills in mixed-ability children.
ARCADE proposes leveraging interactive and digital technologies to create context-aware workspaces to improve physical rehabilitation practices.
AVATAR proposes creating a signing 3D avatar able to synthesize Portuguese Sign Language.
Although text entry is an inherently visually demanding task, we are creating novel non-visual input methods for multiple form factors, from tablets to smartwatches.
Braille 21 is an umbrella term for a series of research projects that aim to bring Braille into the 21st century. Our goal is to facilitate access to Braille in the new digital era.
In this project, we are creating the tools to characterize user performance in the wild and improve current everyday devices and interfaces.
We investigate novel interfaces and interaction techniques for nonvisual word completion. We are particularly interested in quantifying the benefits and costs of such new solutions.
As touchscreens have evolved to provide multitouch capabilities, we are exploring new multi-point feedback solutions.
In this research work, we are investigating novel interactive applications that leverage the use of concurrent speech to improve users' experiences.
This research leverages mobile and wearable technologies to improve classroom accessibility for Deaf and Hard of Hearing college students.
Our goal is to thoroughly study mobile touchscreen interfaces, their characteristics and parameterizations, thus providing the tools for informed interface design.
This project investigates how accurate tracking systems and engaging activities can be leveraged to provide effective evaluation procedures in physical rehabilitation.
We aim to understand the overlap between the problems faced by health-impaired and situationally impaired users when using their mobile devices, and to design solutions that serve both user groups.
Inclusion is key in group work and collaborative learning. We developed a mediator robot to support and promote inclusion in group conversations, particularly in groups composed of children with and without visual impairment. We investigate the effect of two mediation strategies on group dynamics, inclusion, and perception of the robot. We conducted a within-subjects study with 78 children, 26 of whom had visual impairments, in a decision-making activity. Results indicate that the robot can foster inclusion in mixed-visual ability group conversations. The robot succeeds in balancing participation, particularly when using a highly intervening mediation strategy (directive strategy). However, children feel more heard by their peers when the robot is less intervening (organic strategy). We extend prior work on social robots for assisting group work and contribute a mediator robot that enables children with visual impairments to engage equally in group conversations. We finish by discussing design implications for inclusive social robots.
There is an increasing interest in HCI in multisensory interactive systems, creating a need to deepen our understanding of how multiple sensory modalities relate to and affect each other. We extend prior research on crossmodal correspondences by investigating colour and emotional associations with pure vibrotactile stimuli. We asked 32 participants to assign colour properties (hue and brightness) and emotional categories (pleasure, arousal, and dominance) to stimuli of varying amplitude and angular frequency. We found that perceptions of pleasure and arousal increased with amplitude and angular frequency, but negative and relaxing emotional states were challenging to convey. We also found associations between vibrotactile stimuli and colour properties. High amplitude stimuli were linked to warm colours and darker shades, while low amplitude was associated with brighter shades and greater variance in hue. We finish by discussing the causal mechanisms of crossmodal correspondences and contribute a design space for creating multisensory experiences.
We present LocomotiVR, a Virtual Reality tool designed with physiotherapists to improve gait rehabilitation in clinical practice. The tool features two interfaces: a VR environment that immerses the patient in the therapy activity, and a desktop tool operated by a physiotherapist to customize exercises and follow the patient's performance. Results revealed that LocomotiVR achieved promising acceptability, usage, and engagement scores. These results were supported by qualitative data collected from participating experts, who reported high levels of satisfaction and motivation and a willingness to incorporate LocomotiVR into daily therapy practice. Concerns were related to patient safety and the lack of legal regulation.
Collaborative coding environments foster learning, social skills, computational thinking training, and supportive relationships. In the context of inclusive education, these environments have the potential to promote inclusive learning activities for children with mixed-visual abilities. However, there is limited research focusing on remote collaborative environments, despite the opportunity to design new modes of access and control of content that promote more equitable learning experiences. We investigated the tradeoffs between remote and co-located collaboration through a tangible coding kit. We asked ten pairs of mixed-visual ability children to collaborate in an interdependent and asymmetric coding game. We contribute insights on six dimensions - effectiveness, computational thinking, accessibility, communication, cooperation, and engagement - and reflect on differences, challenges, and advantages between collaborative settings related to communication, workspace awareness, and computational thinking training. Lastly, we discuss design opportunities for tangibles, audio, roles, and tasks to create inclusive learning activities in remote and co-located settings.
Dissociative Identity Disorder (DID) is characterized by the presence of at least two distinct identities in the same individual. This paper describes a co-design process with a person living with DID. We first aimed to uncover the main challenges experienced by the co-designer, as well as design opportunities for novel technologies. We then engaged in a prototyping stage to design a wearable display (WhoDID) to facilitate in-person social interactions. The prototype is designed to be worn as a necklace and enables the user to make their fronting personality visible to others, thus facilitating social encounters and sudden changes of identity. We reflect on the design features of WhoDID in the broader context of supporting people with DID. Moreover, we provide insights on co-designing with someone with multiple (sometimes conflicting) personalities regarding requirement elicitation, decision-making, prototyping, and ethics. To our knowledge, we report the first design process with a DID user within the ASSETS and CHI communities. We aim to encourage other assistive technology researchers to design with DID users.
Storytelling has the potential to be an inclusive and collaborative activity. However, it is unclear how interactive storytelling systems can support such activities, particularly when considering mixed-visual ability children. In this paper, we present an interactive multisensory storytelling system and explore the extent to which an emotional robot can be used to support inclusive experiences. We investigate the effect of the robot’s emotional behavior on the joint storytelling process, resulting narratives, and collaboration dynamics. Results show that when children co-create stories with a robot that exhibits emotional behaviors, they include more emotive elements in their stories and explicitly accept more ideas from their peers. We contribute with a multisensory environment that enables children with visual impairments to engage in joint storytelling activities with their peers and analyze the effect of a robot’s emotional behaviors on an inclusive storytelling experience.