On the best low multilinear rank approximation of higher-order tensors

Abstract

This paper deals with the best low multilinear rank approximation of higher-order tensors. Given a tensor, we are looking for another tensor, as close as possible to the given one and with bounded multilinear rank. Higher-order tensors are used in higher-order statistics, signal processing, telecommunications and many other fields. In particular, the best low multilinear rank approximation is used as a tool for dimensionality reduction and signal subspace estimation. Computing the best low multilinear rank approximation is a nontrivial task. Higher-order generalizations of the singular value decomposition lead to suboptimal solutions. The higher-order orthogonal iteration is a widely used, linearly convergent algorithm for further refinement. We aim for conceptually faster algorithms. However, applying standard optimization algorithms directly is not a good idea, since the cost function has infinitely many equivalent solutions, whereas good convergence properties are observed only when the solutions are isolated. This invariance can be removed by working on quotient manifolds. We discuss three algorithms, based on Newton's method, the trust-region scheme and conjugate gradients. We also comment on the local minima of the problem. © 2010 Springer-Verlag Berlin Heidelberg.
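The abstract refers to the standard computational pipeline for this problem: a truncated higher-order SVD gives a suboptimal starting point, which the higher-order orthogonal iteration (HOOI) then refines with linear convergence. The sketch below illustrates that pipeline for a third-order tensor in NumPy; it is a minimal illustration under assumed conventions (the function names `unfold` and `hooi` are hypothetical), not the Newton, trust-region, or conjugate-gradient methods developed in the chapter itself.

```python
# Minimal sketch: truncated HOSVD initialization + HOOI refinement
# for a tensor A and target multilinear rank (r1, r2, ..., rN).
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding: move `mode` to the front and flatten the remaining modes."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hooi(A, ranks, n_iter=50):
    """Low multilinear rank approximation via higher-order orthogonal iteration."""
    # Truncated HOSVD: leading left singular vectors of each unfolding (suboptimal init).
    U = [np.linalg.svd(unfold(A, n), full_matrices=False)[0][:, :r]
         for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(A.ndim):
            # Project A onto the current subspaces of all modes except n ...
            B = A
            for m in range(A.ndim):
                if m != n:
                    B = np.moveaxis(np.tensordot(U[m], B, axes=(0, m)), 0, m)
            # ... and update U[n] as the dominant left singular subspace of mode n.
            U[n] = np.linalg.svd(unfold(B, n), full_matrices=False)[0][:, :ranks[n]]
    # Core tensor of the rank-(r1, ..., rN) approximation: A x_1 U1' x_2 U2' ... x_N UN'.
    core = A
    for m in range(A.ndim):
        core = np.moveaxis(np.tensordot(U[m], core, axes=(0, m)), 0, m)
    return core, U
```

The loop updates one mode at a time while keeping the others fixed, which is exactly the alternating scheme whose linear convergence motivates the faster quotient-manifold methods discussed in the paper.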

Citation (APA)

Ishteva, M., Absil, P. A., Van Huffel, S., & De Lathauwer, L. (2010). On the best low multilinear rank approximation of higher-order tensors. In Recent Advances in Optimization and its Applications in Engineering (pp. 145–164). Springer Berlin Heidelberg. https://doi.org/10.1007/978-3-642-12598-0_13
