Estimating the Information Potential with the Fast Gauss Transform


Abstract

In this paper, we propose a fast and accurate approximation to the information potential of Information Theoretic Learning (ITL) using the Fast Gauss Transform (FGT). We exemplify the approach with the Minimum Error Entropy criterion for training adaptive systems. The FGT reduces the complexity of the estimation from O(N²) to O(pkN), where p is the order of the Hermite approximation and k the number of clusters used in the FGT. Further, we show that the FGT converges rapidly to the actual entropy value with increasing order p, unlike the Stochastic Information Gradient, the existing O(pN) approximation for reducing the computational complexity in ITL. We test the performance of these FGT methods on system identification, with encouraging results. © Springer-Verlag Berlin Heidelberg 2006.
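To make the complexity trade-off concrete, the sketch below contrasts the direct O(N²) estimator of the information potential, V(X) = (1/N²) Σᵢ Σⱼ G(xᵢ − xⱼ; √2σ), with a single-cluster (k = 1) Hermite expansion in the spirit of the FGT, which costs O(pN). This is an illustrative sketch under my own assumptions (1-D data, one expansion center at the sample mean), not the authors' clustered implementation; all function names are mine.

```python
import math
import numpy as np

def information_potential_direct(x, sigma):
    """Direct O(N^2) Parzen estimate of the information potential.
    The kernel variance is 2*sigma^2 (convolution of two width-sigma
    Gaussians), i.e. G(u) = exp(-u^2/delta) / sqrt(pi*delta), delta = 4*sigma^2."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    delta = 4.0 * sigma**2
    diff = x[:, None] - x[None, :]                     # all pairwise differences
    g = np.exp(-diff**2 / delta) / np.sqrt(np.pi * delta)
    return g.sum() / n**2

def information_potential_fgt(x, sigma, p=16):
    """O(pN) approximation via a truncated Hermite expansion about a
    single center c (the k = 1 special case of the FGT):
        exp(-(t_j - t_i)^2) = sum_m (t_i^m / m!) h_m(t_j),
    where h_m(t) = exp(-t^2) H_m(t) are the Hermite functions."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    delta = 4.0 * sigma**2
    c = x.mean()                                       # single expansion center
    t = (x - c) / np.sqrt(delta)
    # Source moments A_m = sum_i t_i^m / m!  -- one O(N) pass per order
    A = np.array([np.sum(t**m) / math.factorial(m) for m in range(p)])
    # Hermite functions by the recurrence h_{m+1} = 2t h_m - 2m h_{m-1}
    h = np.empty((p, n))
    h[0] = np.exp(-t**2)
    if p > 1:
        h[1] = 2.0 * t * h[0]
    for m in range(2, p):
        h[m] = 2.0 * t * h[m - 1] - 2.0 * (m - 1) * h[m - 2]
    s = A @ h                  # s[j] ~= sum_i exp(-(x_j - x_i)^2 / delta)
    return s.sum() / (n**2 * np.sqrt(np.pi * delta))
```

As the abstract notes, the truncated series converges quickly in p when the sources lie near the expansion center; the full FGT keeps p small by first clustering the samples into k groups, each with its own center.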

Citation (APA)

Han, S., Rao, S., & Principe, J. (2006). Estimating the information potential with the fast gauss transform. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3889 LNCS, pp. 82–89). https://doi.org/10.1007/11679363_11
