Learning from non-iid data: Fast rates for the one-vs-all multiclass plug-in classifiers

Abstract

We prove new fast learning rates for the one-vs-all multiclass plug-in classifiers trained either from exponentially strongly mixing data or from data generated by a converging drifting distribution. These are two typical scenarios in which the training data are not iid. The learning rates are obtained under a multiclass version of Tsybakov's margin assumption, a type of low-noise assumption, and do not depend on the number of classes. Our results are general and include a previous result for binary-class plug-in classifiers with iid data as a special case. In contrast to previous works on least-squares SVMs in the binary-class setting, our results retain the optimal learning rate of the iid case.
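For context, Tsybakov's margin assumption in the binary case is commonly stated as follows; the paper works with a multiclass generalization of this condition, whose exact formulation is given in the paper itself.

```latex
% Binary-case Tsybakov margin (low-noise) assumption.
% Here \eta(x) = P(Y = 1 \mid X = x) is the regression function,
% and the condition holds for some constants C > 0 and \alpha \ge 0:
\mathbb{P}\bigl( 0 < \lvert \eta(X) - \tfrac{1}{2} \rvert \le t \bigr)
  \le C\, t^{\alpha} \qquad \text{for all } t > 0 .
```

A one-vs-all plug-in classifier estimates each class-conditional probability with a separate regressor and predicts the class with the largest estimate. The sketch below is a minimal illustration only, not the authors' construction: the choice of estimator (a k-nearest-neighbors regressor from scikit-learn) and all names are stand-ins.

```python
# Minimal, hypothetical sketch of a one-vs-all plug-in classifier.
# The paper analyzes plug-in rules built from estimates of the
# class-conditional probabilities; a k-NN regressor stands in here
# for whatever nonparametric estimator one prefers.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

class OneVsAllPlugIn:
    def __init__(self, n_classes, n_neighbors=5):
        self.n_classes = n_classes
        # One regressor per class, each estimating eta_m(x) = P(Y = m | X = x).
        self.regressors = [KNeighborsRegressor(n_neighbors=n_neighbors)
                           for _ in range(n_classes)]

    def fit(self, X, y):
        for m, reg in enumerate(self.regressors):
            # Binary target: 1 if the label is class m, else 0 (one-vs-all).
            reg.fit(X, (y == m).astype(float))
        return self

    def predict(self, X):
        # Plug-in rule: predict the class whose estimated
        # conditional probability is largest.
        eta_hat = np.column_stack([reg.predict(X) for reg in self.regressors])
        return eta_hat.argmax(axis=1)

# Usage on synthetic data:
#   X = np.random.randn(200, 2); y = np.random.randint(0, 3, size=200)
#   preds = OneVsAllPlugIn(n_classes=3).fit(X, y).predict(X)
```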

Citation (APA)

Dinh, V., Ho, L. S. T., Cuong, N. V., Nguyen, D., & Nguyen, B. T. (2015). Learning from non-iid data: Fast rates for the one-vs-all multiclass plug-in classifiers. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9076, pp. 375–387). Springer Verlag. https://doi.org/10.1007/978-3-319-17142-5_32
