Adaptive Hausdorff distances and tangent distance adaptation for transformation invariant classification learning

Abstract

Tangent distances (TDs) are important concepts for describing distances on data manifolds in machine learning. In this paper we show that the Hausdorff distance is equivalent to the TD under certain conditions, and hence prove the metric properties of TDs. Thereafter, we consider TDs as a dissimilarity measure in learning vector quantization (LVQ) for classification learning of class distributions with high variability. In particular, we integrate the TD into the LVQ learning scheme to obtain TD adaptation during training. The TD approach extends the classical prototype concept from points to affine subspaces, which yields a higher topological richness than prototypes represented as single points in the data space. Using the manifold theory of TDs, we can ensure that the affine subspaces are aligned with directions of transformations that are invariant with respect to class discrimination. We demonstrate the superiority of this new approach on two examples.
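To make the dissimilarity concrete, the following minimal Python sketch (our illustration, not the authors' reference implementation) computes a one-sided tangent distance as the squared Euclidean distance from a data point x to the affine subspace spanned by a prototype w and a tangent basis W. The function names, array shapes, and the orthonormality assumption on W are ours.

import numpy as np

def tangent_distance(x, w, W):
    """Squared distance from x to the affine subspace w + span(W).

    x : (d,)   data point
    w : (d,)   prototype (translation of the subspace)
    W : (d, r) tangent basis; assumed to have orthonormal columns
    """
    diff = x - w
    # Subtract the projection of diff onto span(W); the residual is
    # orthogonal to the subspace, so its squared norm is the distance.
    residual = diff - W @ (W.T @ diff)
    return float(residual @ residual)

def closest_subspace_point(x, w, W):
    """Point of the affine subspace closest to x (the best 'transformed' prototype)."""
    diff = x - w
    return w + W @ (W.T @ diff)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, r = 10, 2
    w = rng.normal(size=d)
    # Orthonormal tangent basis obtained via QR decomposition.
    W, _ = np.linalg.qr(rng.normal(size=(d, r)))
    x = rng.normal(size=d)
    print("tangent distance:  ", tangent_distance(x, w, W))
    # The tangent distance never exceeds the ordinary prototype distance ||x - w||^2.
    print("Euclidean distance:", float((x - w) @ (x - w)))

In an LVQ scheme of the kind described above, w and (under suitable constraints) the basis W would be adapted during training, so that the subspaces align with class-invariant transformation directions; that adaptation step is not shown here.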

Citation (APA)

Saralajew, S., Nebel, D., & Villmann, T. (2016). Adaptive Hausdorff distances and tangent distance adaptation for transformation invariant classification learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9949 LNCS, pp. 362–371). Springer Verlag. https://doi.org/10.1007/978-3-319-46675-0_40
