Minimax Mutual Information approach for ICA of complex-valued linear mixtures

Abstract

Recently, the authors developed the Minimax Mutual Information algorithm for linear ICA of real-valued mixtures, which is based on a density estimate stemming from Jaynes' maximum entropy principle. Since the entropy estimates yield an approximate upper bound on the actual mutual information of the separated outputs, minimizing this upper bound leads to robust performance and good generalization. In this paper, we extend this algorithm to complex-valued mixtures. Simulations with artificial data demonstrate that the proposed algorithm outperforms FastICA. © Springer-Verlag 2004.
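The abstract describes separating complex-valued linear mixtures into independent outputs. A preprocessing step shared by essentially all such ICA methods (including FastICA and mutual-information-based approaches) is whitening the mixtures so their sample covariance becomes the identity; only a unitary demixing rotation then remains to be estimated. The sketch below illustrates this step only, not the authors' Minimax MI algorithm itself; the source distributions and mixing matrix are hypothetical example values.

```python
import numpy as np

rng = np.random.default_rng(0)
n, T = 2, 5000  # number of sources, number of samples

# Two independent non-Gaussian complex sources (hypothetical examples):
# QPSK-like symbols and a complex Laplacian-style source.
s1 = (rng.choice([1, -1], T) + 1j * rng.choice([1, -1], T)) / np.sqrt(2)
s2 = rng.laplace(size=T) + 1j * rng.laplace(size=T)
S = np.vstack([s1, s2])

# Complex mixing matrix (arbitrary illustrative values): x = A s
A = np.array([[1.0 + 0.5j, 0.3 - 0.2j],
              [0.4 + 0.1j, 1.0 - 0.3j]])
X = A @ S

# Whitening: find W such that z = W x has E[z z^H] = I,
# via eigendecomposition of the Hermitian sample covariance.
C = (X @ X.conj().T) / T
d, E = np.linalg.eigh(C)
W_white = E @ np.diag(d ** -0.5) @ E.conj().T
Z = W_white @ X

# The whitened covariance is (numerically) the identity.
print(np.allclose((Z @ Z.conj().T) / T, np.eye(n), atol=1e-8))
```

After whitening, any ICA contrast (negentropy in FastICA, the mutual-information upper bound here) is optimized over unitary matrices acting on `Z`, which is what makes the comparison between the two algorithms well posed.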

Citation (APA)

Xu, J. W., Erdogmus, D., Rao, Y. N., & Príncipe, J. C. (2004). Minimax Mutual Information approach for ICA of complex-valued linear mixtures. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3195, 311–318. https://doi.org/10.1007/978-3-540-30110-3_40
