Confusion matrix disagreement for multiple classifiers

16 citations · 48 Mendeley readers


Abstract

We present a methodology to analyze Multiple Classifier Systems (MCS) performance using the disagreement concept. The goal is to define an alternative to the conventional recognition-rate criterion, which usually requires an exhaustive combination search. The approach defines a Distance-based Disagreement (DbD) measure, using a Euclidean distance computed between confusion matrices, together with a soft-correlation rule to indicate the most likely candidates for the best classifier ensemble. As a case study, we apply this strategy to two different handwriting recognition systems. Experimental results indicate that the proposed method can be used as a low-cost alternative to conventional approaches. © Springer-Verlag Berlin Heidelberg 2007.
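The core of the DbD measure is a Euclidean distance between confusion matrices. A minimal sketch of that computation, assuming row-normalized confusion matrices and using NumPy's Frobenius norm (the function name and the normalization step are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def confusion_distance(cm_a, cm_b):
    """Euclidean (Frobenius) distance between two confusion matrices.

    Each row is divided by its class total so the comparison does not
    depend on test-set size; this normalization is an assumption made
    for the sketch, not necessarily the paper's exact definition.
    """
    a = np.asarray(cm_a, dtype=float)
    b = np.asarray(cm_b, dtype=float)
    a = a / a.sum(axis=1, keepdims=True)
    b = b / b.sum(axis=1, keepdims=True)
    # Frobenius norm of the difference = Euclidean distance on the
    # flattened matrices
    return np.linalg.norm(a - b)

# Two hypothetical 2-class classifiers evaluated on the same test set
cm1 = [[40, 10], [5, 45]]
cm2 = [[30, 20], [15, 35]]
print(confusion_distance(cm1, cm2))  # larger value = more disagreement
```

A small pairwise distance would mark two classifiers as redundant, while larger distances flag complementary candidates for the ensemble, which is the intuition behind the soft-correlation selection rule.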

Citation (APA)

Freitas, C. O. A., De Carvalho, J. M., Oliveira, J. J., Aires, S. B. K., & Sabourin, R. (2007). Confusion matrix disagreement for multiple classifiers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4756 LNCS, pp. 387–396). Springer Verlag. https://doi.org/10.1007/978-3-540-76725-1_41
