We present a performance analysis of three linear dimensionality reduction techniques: Fisher's discriminant analysis (FDA) and two recently introduced methods based on the Chernoff distance between two distributions — the Loog and Duin (LD) method, which maximizes a criterion derived from the Chernoff distance in the original space, and the Rueda and Herrera (RH) method, which maximizes the Chernoff distance in the transformed space. A comprehensive performance analysis of these methods combined with two well-known classifiers, linear and quadratic, on synthetic and real-life data shows that LD and RH outperform FDA, especially with the quadratic classifier, which is strongly related to the Chernoff distance in the transformed space. For the linear classifier, the superiority of RH over the other two methods is also demonstrated. © Springer-Verlag Berlin Heidelberg 2006.
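Since both the LD and RH criteria build on the Chernoff distance between two Gaussian class-conditional distributions, the following minimal sketch shows how that distance is computed. This is a generic NumPy implementation of the standard closed-form Chernoff distance between two Gaussians, not the specific LD or RH optimization procedure from the paper; the function name and signature are illustrative.

```python
import numpy as np

def chernoff_distance(m1, S1, m2, S2, s=0.5):
    """Chernoff distance between Gaussians N(m1, S1) and N(m2, S2).

    Uses the standard closed form for Gaussian densities; s in (0, 1)
    is the Chernoff parameter, and s = 0.5 recovers the Bhattacharyya
    distance as a special case.
    """
    dm = m2 - m1
    Ss = s * S1 + (1 - s) * S2  # s-weighted average covariance
    # Mahalanobis-like term between the means
    term1 = s * (1 - s) / 2 * dm @ np.linalg.solve(Ss, dm)
    # Log-determinant term penalizing covariance mismatch
    term2 = 0.5 * np.log(
        np.linalg.det(Ss)
        / (np.linalg.det(S1) ** s * np.linalg.det(S2) ** (1 - s))
    )
    return term1 + term2
```

With equal covariances the log-determinant term vanishes and the distance reduces to a scaled Mahalanobis distance between the class means, which is the regime where FDA is optimal; the Chernoff-based criteria additionally exploit covariance differences, which is why they pair well with the quadratic classifier.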
CITATION STYLE
Ali, M. L., Rueda, L., & Herrera, M. (2006). On the performance of Chernoff-distance-based linear dimensionality reduction techniques. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4013 LNAI, pp. 467–478). Springer Verlag. https://doi.org/10.1007/11766247_40