Abstract
Mutual information (MI) is a common criterion in independent component analysis (ICA) optimization. MI is derived from probability density functions (PDFs). In some scenarios, assuming a parametric form for the PDF leads to poor performance, so non-parametric PDF and MI estimation is needed. Existing non-parametric algorithms suffer from high complexity, particularly in high dimensions. To counter this obstacle, we present an ICA algorithm based on accelerated kernel entropy estimation. It achieves both high separation performance and low computational complexity. For K sources with N samples, our ICA algorithm has an iteration complexity of at most O(KN log N + K²N). © Springer-Verlag 2004.
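To make the complexity claim concrete, below is a minimal NumPy sketch of the core acceleration idea behind FFT-based kernel (Parzen-window) entropy estimation: samples are binned into a histogram and the Parzen smoothing is performed as a single FFT convolution, replacing the O(N²) pairwise kernel sums with O(N log N) work. The function name fft_kernel_entropy, the bin count, and the bandwidth rule are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fft_kernel_entropy(x, n_bins=1024, bandwidth=None):
    """Approximate the differential entropy H(x) = -E[log p(x)] of a 1-D
    sample via a Parzen-window density estimate, accelerated by binning
    the samples and smoothing the histogram with a Gaussian kernel
    through one FFT-based convolution (O(N log N) overall)."""
    x = np.asarray(x, dtype=float)
    n = x.size
    if bandwidth is None:
        # Silverman-style rule of thumb; an illustrative choice only.
        bandwidth = 1.06 * x.std() * n ** (-1 / 5)
    # Pad the range so circular-convolution wraparound is negligible.
    lo, hi = x.min() - 4 * bandwidth, x.max() + 4 * bandwidth
    edges = np.linspace(lo, hi, n_bins + 1)
    width = edges[1] - edges[0]
    counts, _ = np.histogram(x, bins=edges)

    # Gaussian kernel sampled on the same grid, centered at bin n_bins//2.
    centers = edges[:-1] + width / 2
    grid = centers - centers[n_bins // 2]
    kernel = np.exp(-0.5 * (grid / bandwidth) ** 2)
    kernel /= kernel.sum() * width  # normalize to integrate to 1 on the grid

    # One FFT convolution replaces the O(N^2) pairwise kernel sums.
    pdf = np.fft.irfft(np.fft.rfft(counts / (n * width)) *
                       np.fft.rfft(kernel), n_bins) * width
    # Undo the centering shift and clip tiny negative values from round-off.
    pdf = np.maximum(np.roll(pdf, -(n_bins // 2)), 1e-300)

    # Empirical entropy: average -log p_hat at each sample's bin.
    idx = np.clip(np.searchsorted(edges, x) - 1, 0, n_bins - 1)
    return -np.mean(np.log(pdf[idx]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    g = rng.normal(size=50_000)
    # True entropy of a standard Gaussian is 0.5*log(2*pi*e) ~ 1.4189.
    print(fft_kernel_entropy(g))
```

Within an ICA iteration, an estimator of this form would be applied once per extracted source (K times), which is consistent with the O(KN log N) term in the stated per-iteration cost.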
Citation
Shwartz, S., Zibulevsky, M., & Schechner, Y. Y. (2004). ICA Using kernel entropy estimation with NlogN complexity. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3195, 422–429. https://doi.org/10.1007/978-3-540-30110-3_54