Matrix factorizations and tensor decompositions are now widely used in machine learning and data mining. They decompose input matrix and tensor data into matrix factors by optimizing a least-squares objective function with iterative updating algorithms, e.g., HOSVD (Higher-Order Singular Value Decomposition) and ParaFac (Parallel Factors). One fundamental question about these algorithms remains unanswered: are the solutions they find globally optimal? Surprisingly, by combining theoretical analysis and experimental evidence, we provide a positive answer for HOSVD and a negative answer for ParaFac. Our discovery of this intrinsic property of HOSVD assures us that in real-world applications HOSVD provides repeatable and reliable results. © 2011 Springer-Verlag.
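As a minimal sketch of the HOSVD procedure the abstract refers to (not the authors' implementation; function names and the NumPy-based formulation are illustrative): each factor matrix is taken as the left singular vectors of the corresponding mode unfolding, and the core tensor is the projection of the input onto those factors.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: bring axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_multiply(T, M, mode):
    # Mode-n product: contract matrix M with axis `mode` of tensor T.
    moved = np.moveaxis(T, mode, 0)
    return np.moveaxis(np.tensordot(M, moved, axes=1), 0, mode)

def hosvd(T):
    # Factor matrices: left singular vectors of each mode unfolding.
    Us = [np.linalg.svd(unfold(T, n), full_matrices=False)[0]
          for n in range(T.ndim)]
    # Core tensor: project T onto the column space of every factor.
    G = T
    for n, U in enumerate(Us):
        G = mode_multiply(G, U.T, n)
    return G, Us

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 5, 6))
G, Us = hosvd(X)

# With all singular vectors retained, reconstruction is exact
# (up to floating-point error).
R = G
for n, U in enumerate(Us):
    R = mode_multiply(R, U, n)
assert np.allclose(R, X)
```

Because each factor is obtained from an SVD, the decomposition is deterministic up to sign/rotation ambiguities, which is one sense in which HOSVD yields repeatable results.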
CITATION STYLE
Luo, D., Ding, C., & Huang, H. (2011). Are tensor decomposition solutions unique? On the global convergence of HOSVD and ParaFac algorithms. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6634 LNAI, pp. 148–159). Springer Verlag. https://doi.org/10.1007/978-3-642-20841-6_13