Schillingmann, Lars: A computational model of acoustic packaging. 2012
Contents
Motivation
Event and Action Segmentation
Experimental Methods to Investigate Action Segmentation
Representation and Memory of Meaningful Event Units
Humans Segment Action into Variably Sized Units
Humans Organize Action Segments Hierarchically
Features Used for Event and Action Segmentation
Perceptual Mechanisms in Event and Action Segmentation
Conclusion
Multimodal Processing and Acoustic Packaging
Multimodal Processing and Integration
Early and Late Integration
The Intersensory Redundancy Hypothesis
Auditory Dominance
Acoustic Packaging
A Coalition Model of Language Comprehension
The Emergentist Coalition Model
Evidence for Acoustic Packaging
Conclusion
A Computational Model of Acoustic Packaging
Scenario and Task Overview
Related Work
Acoustic Segmentation
Temporal Visual Segmentation
Multimodal Event Detection and Segmentation
Insights from Human-Robot Teaching Scenarios
Summary
The Acoustic Packaging System
Requirements
System Overview
Acoustic Segmentation
Visual Action Segmentation
Temporal Association
Visualization and Inspection
Conclusion
Acoustic Packaging as Analysis Tool for Multimodal Interaction
How Can Acoustic Packaging Be Evaluated?
Evaluation of Acoustic Packaging on Adult-Adult and Adult-Child Interaction Data
Corpus Overview
Procedure
Evaluation Results
Discussion
Analysis of Adult-Adult and Adult-Child Interaction
Corpus Overview
Procedure and Design
Results on Individual Modalities
Results on the Number of Acoustic Packages per Interaction
Results on the Number of Motion Peaks per Acoustic Package
Discussion
Analysis of Human-Robot Interaction
Corpus Overview
Procedure and Design
Results on Individual Modalities
Results on the Number and Total Length of Acoustic Packages
Results on the Number of Motion Peaks per Acoustic Package
Discussion
Conclusion
Acoustic Packaging as a Basis for Feedback on the iCub Robot
Color Saliency Based Tracking
Color Vision in Infants
Design Rationale and Requirements
The Color Saliency Based Tracking Module
Evaluation
Summary
Prominence Detection
Perceptual Prominence
The Prominence Detection Module
Evaluation
Summary
Integration of Color Saliency and Prominence Detection into the Acoustic Packaging System
Additions to the Existing System Components
Acoustic Packaging as a Basis for Feedback on the iCub Robot
Summary
Analysis of Local Synchrony within Acoustic Packages
Procedure
Prominent Words in Acoustic Packages
Relationship between Color Adjectives and Motion Trajectories
Conclusion
Summary
A Roadmap to Multimodal Action and Language Learning in Interaction
Representation of Action Perception and Action Production in Acoustic Packages
Roadmap Overview
Handling More Cues
Filtering and Optimizing the Action Representation Based on Acoustic Packages
Recognizing Repetitions in the Action Representation
Constructing Larger Structures Grounded in Language and Vision
Using Linguistic Relationships in Speech for Action Segmentation
Feedback Strategies
Initial Interaction Loop
Conclusion
Conclusion
Additional Evaluation Results on Adult-Adult and Adult-Child Interaction