TY  - EDBOOK
AB  - This paper introduces a novel sonification-based interaction support for cooperating users in an Augmented Reality setting. When using head-mounted AR displays, the field of view is limited, which causes users to miss important activities of their interaction partner, such as object interactions or deictic references that (re-)establish joint attention. We introduce an interactive sonification that makes object manipulations of both interaction partners mutually transparent through sounds that convey information about the kind of activity and can optionally even identify the object itself. In this paper we focus on the sonification method, interaction design, and sound design; furthermore, we render the sonification both from sensor data (e.g. object tracking) and from manual annotations. As a spin-off of our approach, we further propose this method for enhancing interaction observation, data analysis, and multimodal annotation in interactional linguistics and conversation analysis.
DA  - 2013
LA  - eng
PY  - 2013
SN  - 978-83-7283-546-48
TI  - Interactive Sonification of Collaborative AR-based Planning Tasks for Enhancing Joint Attention
UR  - https://nbn-resolving.org/urn:nbn:de:0070-pub-26145682
Y2  - 2024-11-22T17:56:20
ER  - 