TY - EDBOOK
AB - We present a cognitively motivated vision architecture for the evaluation of pointing gestures. The system views a scene containing several structured objects and a pointing human hand. A neural classifier estimates the pointing direction; the object correspondence is then established using a sub-symbolic representation of both the scene and the pointing direction. The system achieves high robustness because the result (the indicated location) does not primarily depend on the accuracy of the pointing-direction classification. Instead, the scene is analysed for low-level saliency features to restrict the set of all possible pointing locations to a subset of highly likely locations. This transformation of the "continuous" pointing problem into a "discrete" one simultaneously facilitates auditory feedback whenever the object reference changes, which leads to significantly improved human-machine interaction.
DA - 2003
LA - eng
PY - 2003
SN - 3-540-40408-2
TI - Recognition of Gestural Object Reference with Auditory Feedback
UR - https://nbn-resolving.org/urn:nbn:de:0070-pub-27145831
Y2 - 2024-12-25T18:46:59
ER -