TY  - JOUR
AB  - Modern nonlinear dimensionality reduction (DR) techniques project high dimensional data to low dimensions for their visual inspection. Provided the intrinsic data dimensionality is larger than two, DR necessarily faces information loss and the problem becomes ill-posed. Discriminative dimensionality reduction (DiDi) offers one intuitive way to reduce this ambiguity: it allows a practitioner to identify what is relevant and what should be regarded as noise by means of intuitive auxiliary information such as class labels. One powerful DiDi method relies on a change of the data metric based on the Fisher information. This technique has been presented for vectorial data so far. The aim of this contribution is to extend the technique to more general data structures which are characterised in terms of pairwise similarities only by means of a kernelisation. We demonstrate that a computation of the Fisher metric is possible in kernel space, and that it can efficiently be integrated into modern DR technologies such as t-SNE or the faster Barnes-Hut-SNE. We demonstrate the performance of the approach in a variety of benchmarks.
DA  - 2017
DO  - 10.1016/j.neucom.2017.01.104
LA  - eng
IS  - SI
M2  - 34
PY  - 2017
SN  - 0925-2312
SP  - 34-41
T2  - Neurocomputing
TI  - Efficient Kernelization of Discriminative Dimensionality Reduction
UR  - https://nbn-resolving.org/urn:nbn:de:0070-pub-29093721
Y2  - 2024-11-22T04:33:43
ER  - 