Fritsch, Jan Nikolaus: Towards gestural understanding for intelligent robots. 2012
Abstract
Contents
1 Motivation
1.1 Background
1.2 Aim
1.3 Robot Skills Needed for Gestural Understanding
1.4 Functionalities for Realizing Gesture Understanding
1.4.1 Detection of the Hand
1.4.2 Tracking of the Hand
1.4.3 Recognition of the Gesture
1.4.4 Incorporating Context for Understanding the Gesture
1.4.5 Beyond Communication: Recognizing Manipulative Actions
1.5 Terminology
1.6 Organization of the Book
2 Gestures in Human-Robot Interaction
2.1 Categorizations of Gestures
2.1.1 General Categorizations
2.1.2 Gestures in Human-Computer Interaction
2.1.3 A Gesture Categorization for Human-Robot Interaction
2.1.4 Selected Categories for Creating Intelligent Robots
2.1.5 The Influence of Context on Gesture Understanding
2.2 Sensing Devices for Observing Gesturing Humans
2.2.1 Intrusive Sensing Methods
2.2.2 Active Sensors
2.2.3 Vision-based Sensing Methods
2.2.4 Choosing the Right Sensor
2.3 Design Decisions for Gesture Understanding Systems
2.3.1 Graylevel vs. Color Images
2.3.2 Data-driven vs. Model-driven Processing (Bottom-up vs. Top-down)
2.3.3 Modular vs. Holistic Approaches
2.3.4 Gesture Recognition vs. Hand Detection/Recognition
2.4 Summary
3 Detection of the Hand
3.1 Hand Detection vs. Posture Recognition
3.2 Modeling the Hand's Visual Features
3.2.1 Hand Shape
3.2.2 Skin Color
3.2.3 Hand Appearance
3.3 Model-based Hand Detection
3.3.1 Explicit Modeling of the Hand Constituents
3.3.2 Holistic Models of the Hand
3.4 Summary and Conclusion
4 Tracking of the Hand
4.1 Detection vs. Adaptation
4.2 Tracking based on Hand Detection
4.2.1 Kalman Filtering
4.2.2 Particle Filtering
4.3 Adaptive Visual Features for Hand Tracking
4.3.1 Adaptive Hand Color
4.3.2 Adaptive Hand Shape
4.3.3 Adaptive Hand Appearance
4.4 Example: Detecting and Tracking Hands Based on Skin Color
4.4.1 System Overview
4.4.2 Modeling Skin Color Distribution and Skin Locus
4.4.3 Applying Skin Color Segmentation
4.4.4 Updating the Skin Color Model
4.4.5 Evaluation Results
4.5 Model-based Approaches to Hand Tracking
4.5.1 Model-based Tracking of Hand Configurations
4.5.2 Model-based Tracking of Arm/Body Configurations
4.6 Example: Tracking 3D Human Body Configurations
4.6.1 System Overview
4.6.2 Modeling the Appearance of Humans
4.6.3 Tracking Multiple Body Configuration Hypotheses
4.6.4 Evaluation Results
4.7 Summary and Conclusion
5 Recognition of the Gesture
5.1 Holistic Methods Applying Implicit Models of Hand Gestures
5.1.1 Single-Frame Gesture Recognition
5.1.2 Holistic Temporal Models
5.2 Modular Methods for Matching Trajectories: General Design
5.3 Deterministic Matching Methods
5.3.1 Direct Comparison
5.3.2 Dynamic Time Warping
5.3.3 Modeling Sequences of Atomic Gestures
5.4 Probabilistic Approaches for Trajectory Matching
5.4.1 Hidden Markov Models for Trajectory Recognition
5.4.2 Hierarchical HMMs and Dynamic Bayesian Networks
5.4.3 Particle Filtering for Trajectory Recognition
5.4.4 Trajectory Matching Approaches Employing Neural Networks
5.5 Example: Trajectory-Based Recognition of Pointing Gestures
5.5.1 System Overview
5.5.2 Recognizing Pointing Gestures
5.5.3 Evaluation Results
5.6 Summary and Conclusion
6 Incorporating Context for Understanding the Gesture
6.1 Incorporating User-Provided Context for Pointing Gestures
6.1.1 Posture Information Restricting the Object Search Space
6.1.2 Verbal Information Complementing Pointing Gestures
6.1.3 Verbal Information Specifying Object Properties
6.2 Example: Including Verbal Cues for Resolving Object References
6.2.1 System Overview
6.2.2 Finding Previously Known Objects
6.2.3 Learning Views of Unknown Objects
6.2.4 Evaluation Results
6.3 Incorporating Situational Context for Manipulative Gestures
6.3.1 Body-Centered Action Recognition
6.3.2 Object-Centered Action Recognition
6.3.3 Parallel Approaches Combining Objects and Gestures
6.3.4 Holistic Approaches to Action Recognition
6.4 Example: A Fusion Approach for Recognizing Manipulative Actions
6.4.1 System Overview
6.4.2 Inference Process
6.4.3 Evaluation Results
6.5 Summary and Conclusion
7 Robots Exhibiting Gesture Understanding Capabilities
7.1 Robots Understanding Communicative Gestures
7.1.1 Symbolic and Conventional Gestures
7.1.2 Referential and Pointing Gestures
7.2 Robots Understanding Manipulative Actions
7.2.1 Imitating the Observed Motion
7.2.2 Understanding for Reacting to Manipulation of the Environment
7.3 Summary and Conclusion
8 Towards Learning of Gestures: Studies on Human Gesture Understanding
8.1 Limits of Classical Approaches to Learning
8.2 Modifications of Child-directed Motions: Motionese
8.3 Technical Analysis of Motionese
8.4 Studying Gestural Interaction between Humans and a Robot
8.5 Summary and Conclusion
9 Conclusion
References