Projective nonnegative matrix factorization with α-divergence

Abstract

A new matrix factorization algorithm that combines two recently proposed nonnegative learning techniques is presented. Our algorithm, α-PNMF, inherits from Projective Nonnegative Matrix Factorization (PNMF) the ability to learn a highly orthogonal factor matrix, while generalizing the Kullback-Leibler (KL) divergence to the α-divergence gives the method additional flexibility in approximation. We provide multiplicative update rules for α-PNMF and prove their convergence. The resulting algorithm is empirically verified to give good solutions on a variety of real-world datasets. For feature extraction, α-PNMF learns highly sparse and localized part-based representations of facial images. For clustering, the new method also outperforms both Nonnegative Matrix Factorization with α-divergence and ordinary PNMF, achieving higher purity and lower entropy. © 2009 Springer Berlin Heidelberg.
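To make the objective concrete: PNMF approximates a nonnegative data matrix X by its own projection W Wᵀ X, and α-PNMF scores the approximation with the α-divergence D_α(X‖Y) = (1/(α(α−1))) Σ (xᵢⱼ^α yᵢⱼ^(1−α) − α xᵢⱼ + (α−1) yᵢⱼ). The sketch below is illustrative only (function names, ε clamping, and the default α = 0.5 are my assumptions, not taken from the paper); it computes the objective in NumPy but does not reproduce the paper's multiplicative update rules.

```python
import numpy as np

def alpha_divergence(X, Y, alpha=0.5, eps=1e-12):
    """alpha-divergence D_alpha(X||Y) for elementwise-nonnegative arrays.

    Uses the standard form (1/(alpha*(alpha-1))) * sum(x^a * y^(1-a) - a*x + (a-1)*y),
    which is >= 0 and equals 0 iff X == Y. alpha = 1 (the KL limit) is excluded here.
    """
    X = np.maximum(X, eps)  # clamp to avoid 0^negative-power issues (assumption)
    Y = np.maximum(Y, eps)
    a = alpha
    return np.sum(X**a * Y**(1.0 - a) - a * X + (a - 1.0) * Y) / (a * (a - 1.0))

def alpha_pnmf_objective(X, W, alpha=0.5):
    """Objective of projective NMF under alpha-divergence: D_alpha(X || W W^T X)."""
    return alpha_divergence(X, W @ W.T @ X, alpha)
```

A usage note: with `X` an (n × m) nonnegative data matrix and `W` an (n × r) nonnegative factor, minimizing `alpha_pnmf_objective` over `W` is the α-PNMF problem; the paper derives multiplicative updates for this minimization and proves their convergence.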

Citation (APA)

Yang, Z., & Oja, E. (2009). Projective nonnegative matrix factorization with α-divergence. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5768 LNCS, pp. 20–29). https://doi.org/10.1007/978-3-642-04274-4_3
