The key theorem of learning theory on uncertainty space

Abstract

Statistical Learning Theory is widely regarded as a sound framework for handling a variety of learning problems when only small data samples are available. However, because the theory is built on probability space, it cannot readily handle learning problems on uncertainty space. In this paper, Statistical Learning Theory on uncertainty space is investigated. The Khintchine law of large numbers on uncertainty space is proved, and the notions of empirical risk functional, expected risk functional, and the empirical risk minimization principle on uncertainty space are introduced. On the basis of these concepts, the key theorem of learning theory on uncertainty space is stated and proved.
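
For orientation, the following is a minimal LaTeX sketch of the quantities named in the abstract and of the shape the key theorem usually takes. The uncertain measure \mathcal{M} and expected value operator E are assumed notation in the style of uncertainty theory, not necessarily the paper's exact definitions.

% Assumed notation: hypothesis class \mathcal{F}, loss \ell, uncertain measure \mathcal{M},
% expected value E taken with respect to \mathcal{M}. A sketch, not the paper's definitions.
\[
  R(f) \;=\; E\big[\, \ell\big(y, f(x)\big) \,\big]
  \qquad \text{(expected risk functional)}
\]
\[
  R_{\mathrm{emp}}(f) \;=\; \frac{1}{n} \sum_{i=1}^{n} \ell\big(y_i, f(x_i)\big)
  \qquad \text{(empirical risk functional over a sample of size } n\text{)}
\]
% Empirical risk minimization: choose f_n \in \mathcal{F} minimizing R_emp.
% In the classical (Vapnik) setting, the key theorem states that consistency of this
% principle is equivalent to uniform one-sided convergence of empirical risks to
% expected risks; on uncertainty space the convergence is presumably stated in
% uncertain measure, e.g.
\[
  \lim_{n \to \infty} \mathcal{M}\Big\{ \sup_{f \in \mathcal{F}} \big( R(f) - R_{\mathrm{emp}}(f) \big) > \varepsilon \Big\} \;=\; 0
  \qquad \text{for every } \varepsilon > 0 .
\]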

Citation (APA)

Yan, S., Ha, M., Zhang, X., & Wang, C. (2009). The key theorem of learning theory on uncertainty space. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5551 LNCS, pp. 699–706). https://doi.org/10.1007/978-3-642-01507-6_79
