Generalization performance is a central concern of theoretical research in machine learning. Previous bounds on the generalization ability of the Tikhonov regularization algorithm are almost all based on independent and identically distributed (i.i.d.) samples. In this paper we go beyond this classical framework by establishing a generalization bound for the Tikhonov regularization algorithm with exponentially strongly mixing observations. We then show that the Tikhonov regularization algorithm is consistent under this dependence condition. © 2009 Springer Berlin Heidelberg.
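To make the setting concrete, below is a minimal sketch of the Tikhonov regularization algorithm (kernel ridge regression in an RKHS) trained on dependent samples. The Gaussian kernel, the AR(1) input chain, and all parameter values (`sigma`, `lam`, the AR coefficient 0.7) are illustrative assumptions, not the paper's construction; an AR(1) process is only used here as a standard example of an exponentially strongly mixing sequence.

```python
import numpy as np

def gaussian_kernel(x1, x2, sigma=0.5):
    # RBF kernel matrix K[i, j] = exp(-(x1_i - x2_j)^2 / (2 sigma^2)).
    d = x1[:, None] - x2[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

def tikhonov_fit(x, y, lam=1e-3, sigma=0.5):
    # Closed form of the Tikhonov-regularized least-squares problem
    #   argmin_f (1/m) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
    # via the representer theorem: solve (K + lam * m * I) alpha = y.
    m = len(x)
    K = gaussian_kernel(x, x, sigma)
    return np.linalg.solve(K + lam * m * np.eye(m), y)

def tikhonov_predict(alpha, x_train, x_new, sigma=0.5):
    return gaussian_kernel(x_new, x_train, sigma) @ alpha

# Dependent (non-i.i.d.) inputs: an AR(1) chain, a textbook example
# of an exponentially strongly (alpha-)mixing process.
rng = np.random.default_rng(0)
m = 200
x = np.empty(m)
x[0] = rng.normal()
for t in range(1, m):
    x[t] = 0.7 * x[t - 1] + rng.normal(scale=0.3)
y = np.sin(x) + rng.normal(scale=0.05, size=m)

alpha = tikhonov_fit(x, y)
pred = tikhonov_predict(alpha, x, x)
mse = np.mean((pred - y)**2)
```

The paper's question is how fast the risk of such an estimator approaches the optimum when the samples are mixing rather than i.i.d.; the algorithm itself is unchanged from the classical i.i.d. case.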
CITATION STYLE
Xu, J., & Zou, B. (2009). Learning performance of Tikhonov regularization algorithm with strongly mixing samples. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5551 LNCS, pp. 717–727). https://doi.org/10.1007/978-3-642-01507-6_81