Conflict and its dynamics are phenomena intertwined with social change. The ability to detect conflict is as important as the ability to resolve it effectively, because conflicts can draw attention to problematic structures in a society. Conflict occurs at different levels of social interaction, and despite the vast body of research on conflict in the social sciences, it remains unclear what actually happens during this multi-level process. Why some fights break out while others subside, for example, is difficult to explain unequivocally.
Our approach rests on the premise that dyadic forms of conflict encompass cognitive, affective, and behavioural dimensions that cannot be considered in isolation when examining any form of social conflict. In our research we have been examining small details (deep data) in interactions, with the aim of building agents, populating agent societies, that behave naturally when confronted with interpersonal conflict.
Capturing High-Level Deep Data: Conflict Probes
To learn more about the phenomenon of conflict, we turned to humans, in particular children, to understand how the process unfolds. In the SIREN project, we collected children’s perspectives on the subject by employing an adaptation of cultural probes: the Conflict Probes.
The purpose of this study was to understand the users’ social worlds, in terms of how children behave at school. By using cultural probes, we expected to gain insight into these practices without coming too close and interfering with the ways children interact with each other. In that sense, we expected that children would show us how they themselves appraise their social environment, encouraging us, as designers, to step back from our preconceived notions of their reality and to identify novel and surprising aspects of children’s lives.
Capturing Low-Level Deep Data: The Game-Of-Nines
The Game-Of-Nines (GON) dataset consists of 22 annotated interactions between dyads of children, aged 10-12 years, playing the “Game-Of-Nines”. The interactions were annotated in terms of Emotions, Social Signals, Action Tendencies, and Gaze.
The interactions were videotaped and audio-recorded, and subsequently annotated by two psychologists. The annotation files, along with all the information necessary to use them, are available on request.
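To illustrate how such annotations might be represented programmatically, the sketch below models a single annotated segment and groups segments by the four annotation channels named above. This is a hypothetical structure: the actual GON annotation file format is not described here, and all field names, labels, and timestamps are invented for illustration.

```python
from dataclasses import dataclass
from collections import defaultdict

# The four annotation channels named in the text.
CHANNELS = ("Emotions", "Social Signals", "Action Tendencies", "Gaze")

@dataclass
class Annotation:
    """One annotated segment of a dyadic interaction (hypothetical schema)."""
    dyad_id: int    # which of the 22 dyadic interactions
    channel: str    # one of CHANNELS
    start_s: float  # segment start, in seconds into the recording
    end_s: float    # segment end, in seconds
    label: str      # the annotator's label for this segment

def by_channel(annotations):
    """Group a flat list of annotations by channel, validating channel names."""
    grouped = defaultdict(list)
    for a in annotations:
        if a.channel not in CHANNELS:
            raise ValueError(f"unknown channel: {a.channel}")
        grouped[a.channel].append(a)
    return dict(grouped)

# Example: two invented annotations from one dyad.
sample = [
    Annotation(1, "Emotions", 0.0, 2.5, "frustration"),
    Annotation(1, "Gaze", 0.0, 1.2, "at partner"),
]
grouped = by_channel(sample)
```

A structure like this makes it straightforward to align, say, gaze segments with co-occurring emotion labels when analysing how a conflict episode unfolds over time.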