Klanke, Stefan: Learning manifolds with the Parametrized Self-Organizing Map and Unsupervised Kernel Regression. 2007
Contents
Introduction
Motivation
Outline and contributions
Supervised learning
Statistical model
Empirical error minimization and regularization
Bias-variance dilemma
Model selection
Unsupervised learning
Dimension reduction and manifold learning
Further concepts
Density estimation
Parametric vs. non-parametric methods
Maximum likelihood and Bayesian estimation
The "kernel trick"
Related Methods
Vector quantization and clustering
K-Means clustering
Clustering with mixture models
K-Harmonic Means
Further algorithms
Principal Component Analysis
Probabilistic PCA
Local PCA and mixture models
Kernel PCA
Auto-associative neural networks
The Self-Organizing Map
Variants of the SOM
Principal curves
Generative model
Polygonal lines
K-segments and local PCA
Principal surfaces
The Generative Topographic Mapping
Regularized Principal Manifolds
Further methods
Pointwise embedding methods
Multi-dimensional scaling
The Sammon mapping
Curvilinear Component Analysis
Curvilinear Distance Analysis
Nonlinear spectral embedding methods
Isomap
Locally Linear Embedding
Maximum Variance Unfolding
Further methods and discussion
The Parametrized Self-Organizing Map and extensions
Original formulation
Chebyshev PSOMs and local PSOMs
Application in kinematics learning
PSOM+ extensions
Explicit smoothness measure
Noisy data
Missing data
Per-weight smoothing
Non-grid-organized data
PSOM+ model of PA-10 kinematics
Unsupervised learning of manifolds with the PSOM+
Discussion
Unsupervised Kernel Regression
The Nadaraya-Watson estimator
Choice of smoothing parameter
Multivariate generalization and further kernels
Derivation of UKR
UKR manifold and generalization
Objective function
Regularization approaches
Extension of latent space
Density in latent space
Leave-one-out cross-validation
Optimizing a UKR model
Gradient of the reconstruction error
Spectral initialization
Homotopy-based optimization
Projection of new data
Summary of the procedure
Feature space variant
The L1-norm kernel
Experiments
"Noisy spiral" data
Homotopy and penalty terms: S-shaped triangle data
USPS handwritten digits
"Oil flow" data
Discussion
Extensions to Unsupervised Kernel Regression
UKR with general loss functions
Loss functions used in this work
Including general loss functions with UKR
Optimization scheme for the ε-insensitive loss
Experiments
Relation to feature space UKR
Discussion
UKR with leave-K-out cross-validation
A leave-K-out partitioning scheme
How to get smooth borders
Experiments
Discussion
Landmark UKR
Reconstruction from landmark points
Landmark adaptation and smoothness control
Experiments
Discussion
Unsupervised Local Polynomial Regression
Local polynomial regression
Derivation of ULPR
Experiments
Discussion
Conclusion
Mathematical notation
UKR gradient and computational complexity
Derivative calculations for Unsupervised Local Polynomial Regression
Gradient of the reconstruction error
Local Constant Estimate
Local Linear Estimate
Local Quadratic Estimate (w/o cross-terms)
Local Quadratic Estimate (q=2)
Gradient with respect to the scale
Gradient calculation for the landmark variant
Jacobian matrix for projecting new data
Local constant estimate
Local linear estimate
Local quadratic estimate (w/o cross-terms)
Local quadratic estimate (with cross-term, q=2)
References