It has been shown that many kernel methods can be equivalently formulated as minimal enclosing ball (MEB) problems in a suitable feature space. Exploiting this reduction, efficient algorithms to scale up Support Vector Machines (SVMs) and other kernel methods have been introduced under the name of Core Vector Machines (CVMs). In this paper, we study a new algorithm to train SVMs based on an instance of the Frank-Wolfe optimization method that was recently proposed to approximate the solution of the MEB problem. We show that, specialized to SVM training, this algorithm can scale better than CVMs at the price of slightly lower accuracy. © 2010 Springer-Verlag.
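The abstract refers to Frank-Wolfe-style iterations for approximating the minimal enclosing ball. As a rough illustration (not the authors' specific algorithm), the classic Bădoiu-Clarkson scheme takes O(1/ε²) steps, each moving the current center toward the farthest point with a diminishing step size 1/(k+2); the function name `approx_meb` and all parameters below are illustrative choices:

```python
import numpy as np

def approx_meb(points, eps=0.1):
    """Sketch of a Frank-Wolfe-style (Badoiu-Clarkson) iteration for an
    approximate minimal enclosing ball of a point set (illustrative only)."""
    points = np.asarray(points, dtype=float)
    c = points[0].copy()                      # start at an arbitrary point
    n_iters = int(np.ceil(1.0 / eps**2))      # O(1/eps^2) iterations suffice
    for k in range(n_iters):
        dists = np.linalg.norm(points - c, axis=1)
        farthest = points[np.argmax(dists)]   # the "linear oracle" step
        c = c + (farthest - c) / (k + 2)      # diminishing step size 1/(k+2)
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius

# Corners of the unit square: optimal center (0.5, 0.5), radius sqrt(0.5)
center, radius = approx_meb([[0, 0], [1, 0], [0, 1], [1, 1]])
```

In the SVM setting described by the paper, the same kind of iteration runs in the kernel-induced feature space, so distances are computed via kernel evaluations rather than explicit coordinates.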
CITATION STYLE
Frandi, E., Gasparo, M. G., Lodi, S., Ñanculef, R., & Sartori, C. (2010). A new algorithm for training SVMs using approximate minimal enclosing balls. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6419 LNCS, pp. 87–95). https://doi.org/10.1007/978-3-642-16687-7_16