Information theoretic learning (ITL) was initiated in the late 1990s at CNEL [126]. It uses descriptors from information theory (entropy and divergences), estimated directly from the data, in place of the conventional statistical descriptors of variance and covariance. It can be applied to the adaptation of linear or nonlinear filters, as well as to unsupervised and supervised machine learning tasks. In this chapter, we introduce two commonly used differential entropies for data understanding, and information theoretic measures (ITMs) for evaluation in abstaining classification.
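To make "estimated directly from the data" concrete, the classic ITL route is Rényi's quadratic entropy computed from a Parzen (Gaussian kernel) density estimate, whose pairwise-kernel sum is known as the information potential. The sketch below is illustrative, not taken from this chapter; the function name and the fixed bandwidth `sigma` are assumptions.

```python
import numpy as np

def renyi_quadratic_entropy(x, sigma=1.0):
    """Sample estimate of Renyi's quadratic entropy H2 = -log V,
    where V (the 'information potential') is the mean Gaussian
    kernel, bandwidth sigma*sqrt(2), over all sample pairs.
    The bandwidth choice here is illustrative, not prescribed."""
    x = np.asarray(x, dtype=float).reshape(-1)
    d = x[:, None] - x[None, :]          # all pairwise differences
    s2 = 2.0 * sigma ** 2                # variance of the pair kernel
    kernel = np.exp(-d ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
    v = kernel.mean()                    # information potential V
    return -np.log(v)

# A more spread-out sample should score a higher entropy estimate.
rng = np.random.default_rng(0)
h_narrow = renyi_quadratic_entropy(rng.normal(0.0, 0.5, 500))
h_wide = renyi_quadratic_entropy(rng.normal(0.0, 2.0, 500))
print(h_narrow, h_wide)
```

Because the estimator needs only pairwise kernel evaluations, no explicit density model or binning is required, which is what makes such descriptors usable directly inside filter adaptation and learning objectives.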
He, R., Hu, B., Yuan, X., & Wang, L. (2014). Information measures. In SpringerBriefs in Computer Science (Vol. 0, pp. 13–44). Springer. https://doi.org/10.1007/978-3-319-07416-0_3