Coefficient structure of kernel perceptrons and support vector reduction

Abstract

Support Vector Machines (SVMs) with few support vectors are highly desirable, as they can be applied quickly to new, unseen patterns. In this work we shall study the coefficient structure of the dual representation of SVMs constructed for nonlinearly separable problems through kernel perceptron training. We shall relate these coefficients to the margins of their support vectors (SVs) and to the number of iterations in which these SVs take part. These considerations lead to a remove-and-retrain procedure for building SVMs with a small number of SVs, in which SVs with both suitably small and suitably large coefficients are removed from the training sample. Besides providing a significant SV reduction, our method's computational cost is comparable to that of a single SVM training. © Springer-Verlag Berlin Heidelberg 2007.
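
The abstract does not give the exact coefficient thresholds or the kernel perceptron variant used, so the following is only a minimal sketch of the remove-and-retrain idea: a classical dual kernel perceptron (whose coefficients count the training iterations in which each pattern takes part, matching the coefficient/iteration link mentioned above), followed by removal of SVs whose coefficients fall below or above hypothetical quantile cutoffs low_q and high_q, and a retraining pass. The RBF kernel, gamma, and epochs are likewise illustrative assumptions, not the paper's settings.

import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Pairwise RBF kernel: K[i, j] = exp(-gamma * ||x_i - y_j||^2).
    sq = (X**2).sum(axis=1)[:, None] + (Y**2).sum(axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * np.maximum(sq, 0.0))

def kernel_perceptron(K, y, epochs=100):
    # Dual perceptron: alpha[i] counts the update iterations on pattern i;
    # the decision function is f(x) = sum_j alpha[j] * y[j] * k(x_j, x).
    alpha = np.zeros(len(y))
    for _ in range(epochs):
        mistakes = 0
        for i in range(len(y)):
            if y[i] * ((alpha * y) @ K[:, i]) <= 0:
                alpha[i] += 1.0
                mistakes += 1
        if mistakes == 0:  # training sample separated in feature space
            break
    return alpha

def remove_and_retrain(X, y, gamma=1.0, low_q=0.05, high_q=0.95):
    # Train, drop SVs with very small coefficients (they barely contribute)
    # and very large ones (hard, possibly noisy patterns), then retrain on
    # the reduced sample. The quantile cutoffs are placeholder assumptions.
    alpha = kernel_perceptron(rbf_kernel(X, X, gamma), y)
    sv = alpha > 0
    if not sv.any():
        return X, y, alpha
    lo, hi = np.quantile(alpha[sv], [low_q, high_q])
    keep = ~sv | ((alpha >= lo) & (alpha <= hi))
    X2, y2 = X[keep], y[keep]
    alpha2 = kernel_perceptron(rbf_kernel(X2, X2, gamma), y2)
    return X2, y2, alpha2

For example, on an XOR-like problem one could call remove_and_retrain as below and compare SV counts before and after; since the second training run is over a smaller sample, the total cost stays close to that of a single training, in line with the abstract's claim.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)  # nonlinearly separable labels
X2, y2, alpha2 = remove_and_retrain(X, y, gamma=2.0)
print((alpha2 > 0).sum(), "support vectors after remove-and-retrain")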

Citation (APA)

García, D., González, A., & Dorronsoro, J. R. (2007). Coefficient structure of kernel perceptrons and support vector reduction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4527 LNCS, pp. 337–345). Springer Verlag. https://doi.org/10.1007/978-3-540-73053-8_34
