TY  - CHAP
AB  - For solving tasks cooperatively in close interaction with humans, robots need timely updated spatial representations. However, perceptual information about the current position of interaction partners often arrives late. If robots could anticipate the targets of upcoming manual actions, such as pointing gestures, they would have more time to physically react to human movements and could consider prospective space allocations in their planning. Many findings support close eye-hand coordination in humans, which could be used to predict gestures by observing eye gaze. However, effects vary strongly with the context of the interaction. We collect evidence of eye-hand coordination in a natural route planning scenario in which two agents interact over a map on a table. In particular, we are interested in whether fixations can predict pointing targets and how target distances affect the interlocutor's pointing behavior. We present an automatic method combining marker tracking and 3D modeling that provides eye and gesture measurements in real time.
DO  - 10.1007/978-3-319-11215-2_9
KW  - gestures
KW  - robotics
KW  - eye tracking
KW  - multimodal interaction
LA  - eng
PY  - 2014
SN  - 978-3-319-11214-5
SP  - 121-136
T3  - Spatial Cognition IX
TI  - Spatial references with gaze and pointing in shared space of humans and robots
UR  - https://nbn-resolving.org/urn:nbn:de:0070-pub-26791778
Y2  - 2024-11-22T02:56:55
ER  -