We present a methodology for analyzing the performance of Multiple Classifier Systems (MCS) based on the concept of disagreement. The goal is to define an alternative to the conventional recognition-rate criterion, which usually requires an exhaustive combination search. The approach defines a Distance-based Disagreement (DbD) measure, using a Euclidean distance computed between confusion matrices, together with a soft-correlation rule to indicate the most likely candidates for the best classifier ensemble. As a case study, we apply this strategy to two different handwritten recognition systems. Experimental results indicate that the proposed method can be used as a low-cost alternative to conventional approaches. © Springer-Verlag Berlin Heidelberg 2007.
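The core computation described above — a Euclidean distance between the confusion matrices of two classifiers — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the row normalization and the treatment of the soft-correlation rule are assumptions, as the abstract does not specify them.

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Count matrix m[i, j] = number of samples of true class i predicted as class j."""
    m = np.zeros((n_classes, n_classes), dtype=float)
    for t, p in zip(y_true, y_pred):
        m[t, p] += 1.0
    return m

def dbd(y_true, pred_a, pred_b, n_classes):
    """Distance-based Disagreement: Euclidean (Frobenius) distance between the
    confusion matrices of two classifiers evaluated on the same samples.

    Rows are normalized so the distance does not depend on class sizes;
    this normalization is an assumption, not stated in the abstract."""
    ma = confusion_matrix(y_true, pred_a, n_classes)
    mb = confusion_matrix(y_true, pred_b, n_classes)
    ma /= ma.sum(axis=1, keepdims=True)  # assumes every class appears in y_true
    mb /= mb.sum(axis=1, keepdims=True)
    return np.linalg.norm(ma - mb)  # Euclidean norm of the element-wise difference
```

Pairwise DbD values over a pool of classifiers would then feed the candidate-selection step (the soft-correlation rule), favoring ensembles whose members disagree the most.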
Freitas, C. O. A., De Carvalho, J. M., Oliveira, J. J., Aires, S. B. K., & Sabourin, R. (2007). Confusion matrix disagreement for multiple classifiers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4756 LNCS, pp. 387–396). Springer Verlag. https://doi.org/10.1007/978-3-540-76725-1_41