Information measures


Abstract

Information theoretic learning (ITL) was initiated in the late 1990s at CNEL [126]. It uses descriptors from information theory (entropy and divergences), estimated directly from the data, in place of the conventional statistical descriptors of variance and covariance. It can be applied to the adaptation of linear or nonlinear filters, as well as to unsupervised and supervised machine learning tasks. In this chapter, we introduce two commonly used differential entropies for data understanding, and information theoretic measures (ITMs) for evaluation in abstaining classification.
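As an illustration of estimating an entropy descriptor directly from data, the sketch below implements the Parzen-window "information potential" estimator of Rényi's quadratic entropy commonly used in ITL. The function name, bandwidth parameter, and test data are illustrative assumptions, not taken from the chapter; the bandwidth `sigma` would need tuning for real data.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Estimate Renyi's quadratic entropy H2 of a 1-D sample.

    Uses the Parzen-window information-potential estimator from ITL:
    H2(X) = -log( (1/N^2) * sum_ij G_{sigma*sqrt(2)}(x_i - x_j) ),
    where G is a Gaussian kernel and sigma is the Parzen bandwidth.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    diffs = x[:, None] - x[None, :]      # all pairwise differences
    s2 = 2.0 * sigma**2                  # kernel variance doubles under convolution
    kernel = np.exp(-diffs**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    information_potential = kernel.sum() / n**2
    return -np.log(information_potential)

# Sanity check: a more spread-out sample should have higher entropy.
rng = np.random.default_rng(0)
h_tight = renyi_quadratic_entropy(rng.normal(0.0, 0.5, 500), sigma=0.3)
h_wide = renyi_quadratic_entropy(rng.normal(0.0, 2.0, 500), sigma=0.3)
print(h_tight < h_wide)
```

Because the estimator reduces to a double sum of Gaussian kernel evaluations over sample pairs, it can be computed without ever fitting a parametric density, which is what lets ITL substitute it for variance-based criteria.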

Citation (APA)

He, R., Hu, B., Yuan, X., & Wang, L. (2014). Information measures. In SpringerBriefs in Computer Science (Vol. 0, pp. 13–44). Springer. https://doi.org/10.1007/978-3-319-07416-0_3
