Csiszár's Divergences for Non-negative Matrix Factorization: Family of New Algorithms

Abstract

In this paper we discuss a wide class of loss (cost) functions for non-negative matrix factorization (NMF) and derive several novel algorithms with improved efficiency and robustness to noise and outliers. We review several approaches that allow us to obtain generalized forms of multiplicative NMF algorithms and to unify some existing algorithms. We also give flexible and relaxed forms of the NMF algorithms to increase convergence speed and to impose desired constraints such as sparsity and smoothness of components. Moreover, the effects of various regularization terms and constraints are clearly shown. The scope of these results is broad, since the proposed generalized divergence functions include a large number of useful loss functions, such as the squared Euclidean distance, the Kullback-Leibler divergence, and the Itakura-Saito, Hellinger, Pearson's chi-square, and Neyman's chi-square distances. We have successfully applied the developed algorithms to blind (or semi-blind) source separation (BSS), where the sources may in general be statistically dependent, provided they satisfy other conditions or additional constraints such as non-negativity, sparsity, and/or smoothness. © Springer-Verlag Berlin Heidelberg 2006.
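To make the multiplicative-update idea concrete, below is a minimal sketch of NMF with the (generalized) Kullback-Leibler divergence, one of the loss functions named in the abstract. It uses the classical Lee-Seung multiplicative update rules, not the paper's generalized Csiszár-divergence algorithms; the function name, parameters, and toy data are illustrative assumptions.

import numpy as np

def nmf_kl_multiplicative(V, rank, n_iter=200, eps=1e-9, seed=0):
    """Sketch: multiplicative NMF updates minimizing the generalized
    Kullback-Leibler divergence D(V || W H).  Classical Lee-Seung rule,
    shown only to illustrate the multiplicative-update form."""
    rng = np.random.default_rng(seed)
    n, m = V.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        # H <- H * (W^T (V / (W H))) / (W^T 1)
        WH = W @ H + eps
        H *= (W.T @ (V / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
        # W <- W * ((V / (W H)) H^T) / (1 H^T)
        WH = W @ H + eps
        W *= ((V / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
    return W, H

# Usage: factorize a small non-negative matrix and check the residual
V = np.abs(np.random.rand(20, 30))
W, H = nmf_kl_multiplicative(V, rank=5)
print(np.linalg.norm(V - W @ H))

Sparsity or smoothness constraints of the kind discussed in the paper are typically imposed by adding regularization terms to the loss, which changes the numerator or denominator of these update ratios accordingly.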

Citation (APA)

Cichocki, A., Zdunek, R., & Amari, S. I. (2006). Csiszár’s Divergences for Non-negative Matrix Factorization: Family of New Algorithms. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3889 LNCS, pp. 32–39). https://doi.org/10.1007/11679363_5
