The performance bounds of learning machines based on exponentially strongly mixing sequences

Abstract

Generalization performance is the central object of study in machine learning theory. Vapnik, Cucker, and Smale previously showed that, for learning machines, the empirical risks based on an i.i.d. sequence converge uniformly to their expected risks as the number of samples approaches infinity. In order to study the generalization performance of learning machines with dependent input sequences, this paper extends those results to the case where the i.i.d. sequence is replaced by an exponentially strongly mixing sequence. Using Bernstein's inequality for exponentially strongly mixing sequences, we obtain a bound on the rate of uniform convergence for learning machines, and we establish a bound on the rate of relative uniform convergence for learning machines based on exponentially strongly mixing sequences. Finally, we compare these bounds with previous results. © 2007 Elsevier Ltd. All rights reserved.
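For context, here is a sketch of the standard definitions behind such results; the notation (the constants \bar{\alpha}, c, \gamma and the effective sample size n_e) follows the Modha–Masry formulation commonly used in this literature and is an assumption here, since the abstract does not reproduce the paper's formulas. A stationary sequence \{z_t\} is called exponentially strongly mixing when its \alpha-mixing coefficients decay as

\alpha(k) \le \bar{\alpha}\, e^{-c k^{\gamma}}, \qquad \bar{\alpha},\, c,\, \gamma > 0,

where \alpha(k) measures the dependence between events separated by k time steps. A Bernstein-type inequality for such sequences then bounds the deviation of an empirical mean of a function f with |f| \le M and \sigma^2 = \operatorname{Var} f(z_1), in a form roughly like

\Pr\!\left\{ \left| \frac{1}{n} \sum_{i=1}^{n} f(z_i) - \mathbb{E} f \right| \ge \varepsilon \right\} \le 2 \exp\!\left( - \frac{n_e\, \varepsilon^2}{2\sigma^2 + \frac{2}{3} M \varepsilon} \right),

where the effective sample size n_e is of order n^{\gamma/(\gamma+1)} rather than n. Dependence in the data is thus paid for through a smaller exponent, which is what yields uniform convergence rates slower than the i.i.d. rates of Vapnik, Cucker, and Smale that the paper compares against.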

Citation (APA)

Zou, B., & Li, L. (2007). The performance bounds of learning machines based on exponentially strongly mixing sequences. Computers and Mathematics with Applications, 53(7), 1050–1058. https://doi.org/10.1016/j.camwa.2006.07.015
