Comparison of a novel combined ECOC strategy with different multiclass algorithms together with parameter optimization methods

Abstract

In this paper we consider multiclass learning tasks based on Support Vector Machines (SVMs). The methods currently in use are One-Against-All and One-Against-One, but there remains considerable room for improvement in multiclass learning. We developed a novel combination algorithm called Comb-ECOC, which is based on posterior class probabilities: following the Bayesian decision rule, it assigns each instance to the class with the highest posterior probability. A practical problem when applying a multiclass method is the proper choice of parameters; many users simply keep the default parameters of the respective learning algorithm (e.g. the regularization parameter C and the kernel parameter γ). We tested different parameter optimization methods on different learning algorithms and confirmed the better performance of One-Against-One versus One-Against-All, which can be explained by the maximum margin approach of SVMs. © Springer-Verlag Berlin Heidelberg 2007.
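As an illustration of the setting described in the abstract (a minimal sketch, not the authors' Comb-ECOC implementation), the following Python fragment uses scikit-learn to compare the One-Against-One and One-Against-All decompositions of an RBF-kernel SVM and to tune the regularization parameter C and the kernel parameter γ by cross-validated grid search instead of keeping the defaults. The data set and the parameter grid are illustrative assumptions, not values from the paper.

from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Search over C and gamma of the underlying binary SVM rather than
# relying on the library defaults.
param_grid = {"estimator__C": [0.1, 1, 10, 100],
              "estimator__gamma": [0.001, 0.01, 0.1, 1]}

for name, wrapper in [("One-Against-One", OneVsOneClassifier),
                      ("One-Against-All", OneVsRestClassifier)]:
    # Wrap a binary RBF-kernel SVM in the chosen multiclass decomposition
    # and pick C / gamma by 5-fold cross-validated grid search.
    search = GridSearchCV(wrapper(SVC(kernel="rbf")), param_grid, cv=5)
    search.fit(X_train, y_train)
    print(name, search.best_params_, round(search.score(X_test, y_test), 3))

A Bayes-rule assignment in the spirit of Comb-ECOC would instead combine class-membership probability estimates (e.g. from SVC(probability=True)) and assign each instance to the class with the highest posterior; the details of that combination are specific to the paper and are not reproduced here.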

Citation (APA)

Hülsmann, M., & Friedrich, C. M. (2007). Comparison of a novel combined ECOC strategy with different multiclass algorithms together with parameter optimization methods. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4571 LNAI, pp. 17–31). Springer Verlag. https://doi.org/10.1007/978-3-540-73499-4_3
