Margins, kernels and non-linear smoothed perceptrons

ArXiv: 1505.04123

Abstract

We focus on the problem of finding a non-linear classification function that lies in a Reproducing Kernel Hilbert Space (RKHS), both from the primal point of view (finding a perfect separator when one exists) and the dual point of view (giving a certificate of non-existence), with special focus on generalizations of two classical schemes: the Perceptron (primal) and Von Neumann (dual) algorithms. We cast our problem as one of maximizing the regularized normalized hard-margin (ρ) in an RKHS and rephrase it in terms of a Mahalanobis dot-product/semi-norm associated with the kernel's (normalized and signed) Gram matrix. We derive an accelerated smoothed algorithm with a convergence rate of √(log n)/ρ given n separable points, which is strikingly similar to the classical kernelized Perceptron algorithm, whose rate is 1/ρ². When no such classifier exists, we prove a version of Gordan's separation theorem for RKHSs and give a reinterpretation of negative margins. This allows us to give guarantees for a primal-dual algorithm that halts in min{√n/ρ, √n/ε} iterations with either a perfect separator in the RKHS if the primal is feasible, or a dual ε-certificate of near-infeasibility.
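
For reference, the classical kernelized Perceptron mentioned above (the 1/ρ² baseline, not the paper's accelerated smoothed algorithm) can be sketched in a few lines. The RBF kernel choice and parameter names below are illustrative assumptions, not taken from the paper.

import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian/RBF kernel matrix; the kernel choice is an illustrative assumption.
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-gamma * sq)

def kernel_perceptron(X, y, gamma=1.0, max_iters=1000):
    """Classical kernelized Perceptron baseline.
    X: (n, d) points, y: (n,) labels in {-1, +1}. Returns dual coefficients alpha."""
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)           # Gram matrix of the n points
    alpha = np.zeros(n)                   # dual weights: f(x) = sum_i alpha_i * y_i * k(x_i, x)
    for _ in range(max_iters):
        mistakes = 0
        for i in range(n):
            # sign of the current classifier at x_i; <= 0 means a mistake
            if y[i] * np.dot(alpha * y, K[:, i]) <= 0:
                alpha[i] += 1.0           # mistake: add x_i to the kernel expansion
                mistakes += 1
        if mistakes == 0:                 # perfect separator in the RKHS found
            break
    return alpha

On a separable sample this loop stops after roughly 1/ρ² updates (Novikoff's bound), which is the rate the √(log n)/ρ accelerated smoothed variant described in the abstract improves upon.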

Citation (APA)

Ramdas, A., & Peña, J. (2014). Margins, kernels and non-linear smoothed perceptrons. In 31st International Conference on Machine Learning, ICML 2014 (Vol. 1, pp. 415–424). International Machine Learning Society (IMLS).
