Incremental learning in the non-negative matrix factorization


Abstract

The non-negative matrix factorization (NMF) factorizes strictly positive data into strictly positive activations and base vectors. In its standard form, the input data must be presented as a single batch, so the NMF can only represent the input space contained in that batch and cannot adapt to changes afterwards. In this paper we propose a method to overcome this limitation and enable the NMF to adapt incrementally and continuously to new data. The proposed algorithm covers the (possibly growing) input space without placing further constraints on the algorithm. We show that, using our method, the NMF approximates the dimensionality of a dataset and is therefore able to determine the required number of base vectors automatically. © 2009 Springer Berlin Heidelberg.
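For context, the sketch below shows standard batch NMF with multiplicative (Lee–Seung) updates, the baseline setting the abstract refers to; it is not the incremental algorithm proposed in the paper, and the function name, initialization, and parameters are illustrative assumptions.

```python
import numpy as np

def nmf(V, r, n_iter=200, eps=1e-9):
    """Standard batch NMF via multiplicative updates.

    Factorizes a non-negative data matrix V (m x n) into
    base vectors W (m x r) and activations H (r x n) so that V ~ W @ H.
    """
    m, n = V.shape
    rng = np.random.default_rng(0)
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Update activations H while keeping the base vectors fixed
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # Update base vectors W while keeping the activations fixed
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factorize a small random non-negative batch with r = 5 base vectors
V = np.abs(np.random.default_rng(1).random((20, 50)))
W, H = nmf(V, r=5)
print(np.linalg.norm(V - W @ H))  # reconstruction error on the training batch
```

Note that in this batch formulation the number of base vectors r is fixed in advance and the factorization cannot absorb data arriving after training, which is exactly the limitation the paper addresses.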

Citation (APA)

Rebhan, S., Sharif, W., & Eggert, J. (2009). Incremental learning in the non-negative matrix factorization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5507 LNCS, pp. 960–969). https://doi.org/10.1007/978-3-642-03040-6_117
