TY - BOOK
AB - This paper presents and evaluates auditory representations of object interactions to support cooperating users of an Augmented Reality (AR) system. Head-mounted AR displays in particular limit the field of view and thus cause users to miss relevant activities of their interaction partner, such as object interactions or deictic references that would normally establish joint attention. We start from an analysis of the differences between face-to-face interaction and interaction via the AR system, using interactional-linguistic conversation analysis. From this we derive a set of features that interaction partners rely on to coordinate their activities. We then present five different interactive sonifications that make the object manipulations of interaction partners audible and convey information about the kind of activity. We furthermore evaluate our designs in a study in which participants observed an interaction episode and rated features of the sonifications in questionnaires. From the results, we derive insights into factors that make sonifications acceptable for supporting dyadic interaction.
DA - 2013
DO - 10.1145/2544114.2597651
LA - eng
PY - 2013
SN - 978-1-4503-2659-9
TI - Sonification for Supporting Joint Attention in Dyadic Augmented Reality-based Cooperations
UR - https://nbn-resolving.org/urn:nbn:de:0070-pub-26049578
Y2 - 2024-11-22T09:59:58
ER -