Beyond Standard Metrics - On the Selection and Combination of Distance Metrics for an Improved Classification of Hyperspectral Data


Abstract

Training and application of prototype-based learning approaches such as Learning Vector Quantization, Radial Basis Function networks, and Supervised Neural Gas require a distance metric to measure the similarity between feature vectors as well as between feature vectors and class prototypes. While the Euclidean distance is used in many cases, the highly correlated features of a hyperspectral representation and its high dimensionality favor more sophisticated distance metrics. In this paper we first investigate the role of different metrics in the successful classification of hyperspectral data sets from real-world classification tasks. Second, we show that considerable performance gains can be achieved by a classification system that combines a number of prototype-based models trained on differently parametrized divergence measures. Several combination strategies are evaluated on these data sets. © Springer International Publishing Switzerland 2014.
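The choice of distance measure described in the abstract can be illustrated with a minimal sketch (not the authors' code): nearest-prototype classification of toy spectra under two interchangeable measures, the Euclidean distance and a symmetrized Kullback-Leibler divergence applied to normalized spectra. The prototype vectors and labels below are invented for illustration only.

```python
import math

def euclidean(x, y):
    # Standard Euclidean distance between two feature vectors.
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def normalize(x):
    # Scale a spectrum so its components sum to 1 (treat it as a distribution).
    s = sum(x)
    return [a / s for a in x]

def sym_kl(x, y):
    # Symmetrized Kullback-Leibler divergence on normalized spectra;
    # one example of the divergence measures mentioned in the abstract.
    p, q = normalize(x), normalize(y)
    return sum((a - b) * math.log(a / b) for a, b in zip(p, q))

def classify(x, prototypes, dist):
    # Assign the label of the nearest prototype under the given measure.
    return min(prototypes, key=lambda pl: dist(x, pl[0]))[1]

# Toy prototypes (hypothetical 3-band "spectra" with class labels).
prototypes = [([1.0, 2.0, 4.0], "A"), ([4.0, 2.0, 1.0], "B")]

print(classify([1.2, 2.1, 3.9], prototypes, euclidean))  # near prototype A
print(classify([2.0, 4.0, 8.0], prototypes, sym_kl))     # same shape as A, scaled
```

Note that the second query is a scaled copy of prototype A: the normalized KL divergence is invariant to such intensity scaling, whereas the raw Euclidean distance is not — one reason the paper compares measures beyond the Euclidean standard.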

Citation (APA)

Knauer, U., Backhaus, A., & Seiffert, U. (2014). Beyond Standard Metrics - On the Selection and Combination of Distance Metrics for an Improved Classification of Hyperspectral Data. In Advances in Intelligent Systems and Computing (Vol. 295, pp. 167–177). Springer Verlag. https://doi.org/10.1007/978-3-319-07695-9_16
