Schulz, Alexander: Discriminative dimensionality reduction: variations, applications, interpretations. 2017
Contents
Introduction
Motivation
Scientific contributions and structural overview
Publications in the context of this thesis
Discriminative dimensionality reduction
Motivation
Scientific contributions and structure of the chapter
Kernel t-SNE
t-distributed stochastic neighbor embedding (t-SNE)
Assessing the quality of dimensionality reduction mappings
Parametric extension of dimensionality reduction
Illustration
Definition of the Fisher metric
Metrics
Fisher metric as a special case of the Riemannian metric
Approximation of the shortest paths
Example
Discriminative dimensionality reduction for classification tasks
Approximation of the probabilities
Example
Discriminative dimensionality reduction in kernel space
Kernelization
Experiments
Conclusion
Discriminative dimensionality reduction for regression tasks
Gaussian Processes for regression
Estimating the Fisher matrix based on a Gaussian Process
Justification for discriminative DR
Experiments
Conclusion
Discussion
Visualization of functions in high-dimensional spaces
Motivation
Scientific contributions and structure of the chapter
Dimensionality reduction techniques
Inverse dimensionality reduction
General framework
Naive approach
Main procedure
Evaluation
Experiments with classification functions
Experiments with regression functions
Discussion
Interpretation of data mappings
Motivation
Scientific contributions and structure of the chapter
Estimating interpretable components for nonlinear DR
Neighborhood Retrieval Optimizer
Feature selection for DR
Relevance learning for DR
Metric learning for DR
Experiments
Valid interpretation of feature relevance for linear data mappings
Definition and measure of feature relevance
Linear bounds
Metric learning as linear data transformation
Experiments for linear regression
Experiments for metric learning
Discussion
Dimensionality reduction for transfer learning
Motivation
Scientific contributions and structure of the chapter
Transfer learning without given correspondences
Shared linear embedding
Shared nonlinear embedding
Experiments
Discussion
Conclusion
Mathematical derivations
The Fisher information matrix for a discrete auxiliary variable
The Fisher information matrix for a continuous auxiliary variable
Publications in the context of this thesis
Bibliography