Multi-class support vector machine simplification

Abstract

In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one that consists of a smaller number of SVs, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes: calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency and stability. In constructing RVs, it requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that simplified SVMs can run up to 100 times faster than the original SVMs with almost no change in predictive accuracy. © 2008 Springer Berlin Heidelberg.
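The core step the abstract describes, replacing a pair of SVs with a single combined vector by maximizing a one-variable function, can be sketched as follows for a Gaussian kernel in the binary, scalar-weight case (the paper's actual construction handles multi-weighted SVs for multi-class SVMs). The function name, the grid search in place of an exact line search, and the weight update are illustrative assumptions, not the authors' exact formulas.

```python
import math

def merge_support_vectors(z_i, z_j, a_i, a_j, gamma, steps=1000):
    """Sketch: replace SVs z_i, z_j (weights a_i, a_j) with one vector
    z = k*z_i + (1-k)*z_j, choosing k in [0, 1] to maximize the
    one-variable objective f(k) = a_i*K(z, z_i) + a_j*K(z, z_j)
    for the Gaussian kernel K(u, v) = exp(-gamma*||u - v||^2)."""
    # Squared distance between the two SVs; for a point on the segment,
    # ||z - z_i||^2 = (1-k)^2 * d2 and ||z - z_j||^2 = k^2 * d2.
    d2 = sum((x - y) ** 2 for x, y in zip(z_i, z_j))
    best_k, best_f = 0.0, float("-inf")
    for s in range(steps + 1):  # simple grid search stands in for the maximization
        k = s / steps
        f = (a_i * math.exp(-gamma * (1 - k) ** 2 * d2)
             + a_j * math.exp(-gamma * k ** 2 * d2))
        if f > best_f:
            best_k, best_f = k, f
    z = [best_k * x + (1 - best_k) * y for x, y in zip(z_i, z_j)]
    # Since K(z, z) = 1 for the Gaussian kernel, taking the objective value
    # itself as the new weight preserves the pair's contribution at z
    # (an assumed, simplified weight update).
    return z, best_f
```

For two equally weighted SVs the maximizer lands at the midpoint, e.g. merging `[0, 0]` and `[1, 0]` with `a_i = a_j = 1`, `gamma = 1` yields `z = [0.5, 0.0]`; repeating such merges bottom-up shrinks the SV set while approximately preserving the decision function.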

APA

Nguyen, D., Matsumoto, K., Hashimoto, K., Takishima, Y., Takatori, D., & Terabe, M. (2008). Multi-class support vector machine simplification. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5351 LNAI, pp. 799–808). https://doi.org/10.1007/978-3-540-89197-0_74
