TY - BOOK
AB - This paper presents a novel approach for using sound to externalize emotional states so that they become an object for communication and reflection, both for the users themselves and for interaction with other users such as peers, parents, or therapists. We present abstract, vocal, and physiology-based sound synthesis models whose sound spaces each cover various emotional associations. The key idea in our approach is to use evolutionary optimization to enable users to find emotional prototypes, which are in turn fed into a kernel-regression-based mapping that allows users to navigate the sound space via a low-dimensional interface, controlled in a playful way through tablet interactions. The method is intended to support people with autism spectrum disorder.
DA - 2016
DO - 10.1145/2986416.2986437
KW - Emotions
KW - Sound
KW - Auditory Display
KW - Autism Spectrum Disorder (ASD)
LA - eng
PY - 2016
TI - EmoSonics – Interactive Sound Interfaces for the Externalization of Emotions
UR - https://nbn-resolving.org/urn:nbn:de:0070-pub-29050360
Y2 - 2024-11-22T03:19:47
ER -