TY - CONF
AB - In this paper we present our humanoid robot “Meka”, participating in a multi party human robot dialogue scenario. Active arbitration of the robot's attention based on multi-modal stimuli is utilised to attain persons which are outside of the robot's field of view. We investigate the impact of this attention management and an addressee recognition on the robot's capability to distinguish utterances directed at it from communication between humans. Based on the results of a user study, we show that mutual gaze at the end of an utterance, as a means of yielding a turn, is a substantial cue for addressee recognition. Verification of a speaker through the detection of lip movements can be used to further increase precision. Furthermore, we show that even a rather simplistic fusion of gaze and lip movement cues allows a considerable enhancement in addressee estimation, and can be altered to adapt to the requirements of a particular scenario.
DA - 2016
DO - 10.1145/2974804.2974823
LA - eng
PY - 2016
TI - Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees
UR - https://nbn-resolving.org/urn:nbn:de:0070-pub-29046115
Y2 - 2024-11-24T04:36:29
ER -