TY  - JOUR
AB  - Communicating face-to-face, interlocutors frequently produce multimodal meaning packages consisting of speech and accompanying gestures. We discuss a systematically annotated speech and gesture corpus consisting of 25 route-and-landmark-description dialogues, the Bielefeld Speech and Gesture Alignment corpus (SaGA), collected in experimental face-to-face settings. We first describe the primary and secondary data of the corpus and its reliability assessment. Then we go into some of the projects carried out using SaGA, demonstrating the wide range of its usability: on the empirical side, there is work on gesture typology, individual and contextual parameters influencing gesture production, and gestures’ functions for dialogue structure. Speech-gesture interfaces have been established extending unification-based grammars. In addition, the development of a computational model of speech-gesture alignment and its implementation constitutes a research line we focus on.
DA  - 2013
DO  - 10.1007/s12193-012-0106-8
KW  - Multimodal dialogue
KW  - Iconic gesture
KW  - Multimodal simulation
KW  - Multimodal data
KW  - Speech-and-gesture alignment
LA  - eng
IS  - 1-2
M2  - 5
PY  - 2013
SN  - 1783-7677
SP  - 5-18
T2  - Journal on Multimodal User Interfaces
TI  - Data-based analysis of speech and gesture: the Bielefeld Speech and Gesture Alignment corpus (SaGA) and its applications
UR  - https://nbn-resolving.org/urn:nbn:de:0070-pub-25222997
Y2  - 2024-11-21T21:01:41
ER  - 