Support Vector Machines (SVMs) with few support vectors are desirable, as they can be applied quickly to new, unseen patterns. In this work we study the coefficient structure of the dual representation of SVMs built for nonlinearly separable problems through kernel perceptron training. We relate these coefficients to the margins of their support vectors (SVs) and to the number of iterations in which these SVs take part. These observations lead to a remove-and-retrain procedure for building SVMs with a small number of SVs, in which SVs with either suitably small or suitably large coefficients are removed from the training sample. Besides providing a significant SV reduction, our method has a computational cost comparable to that of a single SVM training. © Springer-Verlag Berlin Heidelberg 2007.
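The abstract does not give the procedure's details, but the dual coefficient structure it refers to can be illustrated with a minimal sketch: in a dual kernel perceptron, each coefficient counts how many update iterations a pattern takes part in, and a remove-and-retrain step then drops patterns whose coefficients are unusually small or large before training again. The RBF kernel, the quantile thresholds, and the toy data below are all illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel (illustrative choice)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_perceptron(K, y, epochs=100):
    """Dual kernel perceptron: alpha[i] counts the updates on pattern i."""
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(epochs):
        mistakes = 0
        for i in range(n):
            # dual decision value on pattern i: sum_j alpha_j y_j K(x_j, x_i)
            if y[i] * ((alpha * y) @ K[:, i]) <= 0:
                alpha[i] += 1
                mistakes += 1
        if mistakes == 0:  # converged: no misclassified patterns left
            break
    return alpha

# toy two-class problem
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
K = rbf_kernel(X, X)
alpha = kernel_perceptron(K, y)

# remove-and-retrain sketch: drop SVs whose coefficients fall outside a
# central quantile band (the 10%/90% thresholds are hypothetical), keep the
# remaining patterns and retrain on them
sv = alpha > 0
lo, hi = np.quantile(alpha[sv], [0.1, 0.9])
keep = ~sv | ((alpha >= lo) & (alpha <= hi))
alpha2 = kernel_perceptron(K[np.ix_(keep, keep)], y[keep])
```

Patterns with nonzero `alpha` play the role of the SVs; pruning both tails of the coefficient distribution mirrors the paper's idea that both small- and large-coefficient SVs are candidates for removal before retraining.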
CITATION STYLE
García, D., González, A., & Dorronsoro, J. R. (2007). Coefficient structure of kernel perceptrons and support vector reduction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4527 LNCS, pp. 337–345). Springer Verlag. https://doi.org/10.1007/978-3-540-73053-8_34