A second-order Perceptron algorithm


Abstract

We introduce a variant of the Perceptron algorithm, called the second-order Perceptron algorithm, which is able to exploit certain spectral properties of the data. We analyze the second-order Perceptron algorithm in the mistake bound model of on-line learning and prove bounds in terms of the eigenvalues of the Gram matrix built from the data. The performance of the second-order Perceptron algorithm depends on a parameter that controls its sensitivity to the distribution of these eigenvalues. Since this information is not available to on-line algorithms in advance, we also design a refined version of the second-order Perceptron algorithm which adaptively sets the value of this parameter. For this second algorithm we are able to prove mistake bounds corresponding to a nearly optimal constant setting of the parameter.
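The abstract does not spell out the update rule, so the sketch below is only a minimal illustration of a second-order-Perceptron-style learner, assuming the commonly cited formulation in which the prediction whitens the current input by a regularized correlation matrix of the instances stored on past mistakes; the parameter a plays the role of the sensitivity parameter mentioned above. The class name, variable names, and the synthetic data are illustrative, not taken from the paper.

```python
import numpy as np

class SecondOrderPerceptron:
    """Sketch of a mistake-driven second-order Perceptron-style learner.

    `a` weights the identity added to the correlation matrix of the
    instances stored on past mistakes, controlling sensitivity to the
    eigenvalue spread of the data (see abstract).
    """

    def __init__(self, dim, a=1.0):
        self.a = a
        self.dim = dim
        self.M = np.zeros((dim, dim))   # sum of x x^T over mistaken trials
        self.v = np.zeros(dim)          # sum of y * x over mistaken trials
        self.mistakes = 0

    def predict(self, x):
        # Whitened prediction: sign( v^T (a I + M + x x^T)^{-1} x ).
        A = self.a * np.eye(self.dim) + self.M + np.outer(x, x)
        w = np.linalg.solve(A, self.v)
        return 1 if w @ x >= 0 else -1

    def update(self, x, y):
        # Conservative update: store the example only when a mistake occurs.
        if self.predict(x) != y:
            self.M += np.outer(x, x)
            self.v += y * x
            self.mistakes += 1

# Illustrative run on synthetic linearly separable data.
rng = np.random.default_rng(0)
u = rng.normal(size=5)
learner = SecondOrderPerceptron(dim=5, a=1.0)
for _ in range(200):
    x = rng.normal(size=5)
    learner.update(x, 1 if u @ x >= 0 else -1)
print("mistakes:", learner.mistakes)
```

In this sketch, a larger a makes the prediction closer to a standard (first-order) Perceptron, while a smaller a puts more weight on the spectral structure of the stored instances.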

Citation (APA)

Cesa-Bianchi, N., Conconi, A., & Gentile, C. (2002). A second-order Perceptron algorithm. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 2375, pp. 121–137). Springer-Verlag. https://doi.org/10.1007/3-540-45435-7_9
