For solving tasks cooperatively in close interaction with humans, robots need spatial representations that are updated in a timely manner. However, perceptual information about the current position of interaction partners often arrives too late. If robots could anticipate the targets of upcoming manual actions, such as pointing gestures, they would have more time to react physically to human movements and could take prospective space allocations into account in their planning. Many findings support close eye-hand coordination in humans, which could be exploited to predict gestures by observing eye gaze. However, the effects vary strongly with the context of the interaction. We collect evidence of eye-hand coordination in a natural route-planning scenario in which two agents interact over a map on a table. In particular, we are interested in whether fixations can predict pointing targets and how target distances affect the interlocutor's pointing behavior. We present an automatic method combining marker tracking and 3D modeling that provides eye and gesture measurements in real time.
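The abstract gives no implementation details, but a setup that maps tracked gaze or pointing rays onto a map lying on a table plausibly involves intersecting each ray with the table plane. The following is a minimal illustrative sketch of that geometric step only; the function name, parameters, and NumPy formulation are assumptions for illustration, not the authors' method.

```python
import numpy as np

def ray_table_intersection(origin, direction, table_point, table_normal):
    """Intersect a tracked gaze or pointing ray with the table plane.

    origin, direction -- 3D ray estimated from eye/marker tracking (assumed)
    table_point, table_normal -- table plane taken from a 3D scene model
    Returns the 3D hit point on the table, or None if there is no hit.
    """
    d = direction / np.linalg.norm(direction)
    denom = float(np.dot(table_normal, d))
    if abs(denom) < 1e-9:      # ray runs parallel to the table plane
        return None
    t = float(np.dot(table_normal, table_point - origin)) / denom
    if t <= 0:                 # table lies behind the observer
        return None
    return origin + t * d

# Hypothetical example: a gaze ray from 40 cm above the table,
# looking down and forward onto the map surface (z = 0).
eye = np.array([0.0, 0.0, 0.4])
gaze = np.array([0.2, 0.1, -0.4])
hit = ray_table_intersection(eye, gaze,
                             np.array([0.0, 0.0, 0.0]),
                             np.array([0.0, 0.0, 1.0]))
print(hit)  # -> [0.2 0.1 0. ], a candidate fixation target on the map
```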
Bibliographic record
- Title: Spatial references with gaze and pointing in shared space of humans and robots
- Contained in: Spatial Cognition IX, pp. 121-136
- Language: English
- Document type: Article in an edited volume
- ISBN: 978-3-319-11214-5
Access restriction
- The document is freely available