Sparse super symmetric tensor factorization

Abstract

In this paper we derive and discuss a wide class of algorithms for 3D Super-Symmetric Nonnegative Tensor Factorization (SNTF), or nonnegative symmetric PARAFAC, and, as a special case, Symmetric Nonnegative Matrix Factorization (SNMF). These algorithms have many potential applications, including multi-way clustering, feature extraction, multi-sensory or multi-dimensional data analysis, and nonnegative neural sparse coding. The main advantages of the derived algorithms are their relatively low complexity and, in the case of the multiplicative algorithms, the possibility of extending them straightforwardly to the factorization of L-order tensors thanks to a convenient symmetry property. We also propose to use a wide class of cost functions, such as the squared Euclidean distance, the Kullback-Leibler I-divergence, the Alpha-divergence, and the Beta-divergence. Preliminary experimental results confirm the validity and good performance of some of these algorithms, especially when the data have sparse representations. © 2008 Springer-Verlag Berlin Heidelberg.
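To make the multiplicative-update idea concrete, below is a minimal sketch of the SNMF special case, where a nonnegative symmetric matrix Y is approximated as Y ≈ A Aᵀ with A ≥ 0 under the squared Euclidean cost, using the common heuristic update A ← A ⊙ (YA) ⊘ (A AᵀA). This is a generic sketch under those assumptions, not the exact update rules derived in the paper; the function name snmf_multiplicative and its parameters are illustrative only.

```python
import numpy as np

def snmf_multiplicative(Y, rank, n_iter=500, eps=1e-9, seed=0):
    """Symmetric NMF sketch: approximate a nonnegative symmetric matrix
    Y ~= A @ A.T with A >= 0, using a standard multiplicative update for
    the squared Euclidean cost. Generic illustration only -- not the
    specific algorithms derived in the paper."""
    rng = np.random.default_rng(seed)
    n = Y.shape[0]
    A = rng.random((n, rank))              # nonnegative random initialization
    for _ in range(n_iter):
        numer = Y @ A                      # "positive" part of the gradient
        denom = A @ (A.T @ A) + eps        # "negative" part, eps avoids division by zero
        A *= numer / denom                 # elementwise multiplicative step keeps A >= 0
    return A

if __name__ == "__main__":
    # Toy usage: recover a rank-2 nonnegative symmetric matrix.
    rng = np.random.default_rng(1)
    B = rng.random((8, 2))
    Y = B @ B.T
    A = snmf_multiplicative(Y, rank=2)
    print("relative error:", np.linalg.norm(Y - A @ A.T) / np.linalg.norm(Y))
```

Because the update is purely elementwise and multiplicative, nonnegativity of A is preserved automatically at every iteration, which is the property that makes extensions of this scheme to higher-order symmetric factorizations attractive.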

Citation (APA)

Cichocki, A., Jankovic, M., Zdunek, R., & Amari, S. I. (2008). Sparse super symmetric tensor factorization. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4984 LNCS, pp. 781–790). https://doi.org/10.1007/978-3-540-69158-7_81
