On optimal low rank Tucker approximation for tensors: the case for an adjustable core size

Abstract

Approximating high-order tensors by low Tucker-rank tensors has applications in psychometrics, chemometrics, computer vision, and biomedical informatics, among others. Traditionally, solution methods for finding a low Tucker-rank approximation presume that the size of the core tensor is specified in advance, which may not be a realistic assumption in many applications. In this paper we propose a new computational model in which the configuration and the size of the core become part of the decisions to be optimized. Our approach is based on the so-called maximum block improvement method for non-convex block optimization. Numerical tests on various real data sets from gene expression analysis and image compression are reported, which show promising performance of the proposed algorithms.
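The paper's adjustable-core algorithm is not reproduced here, but the block-optimization idea it builds on can be illustrated with a minimal sketch: a standard Tucker approximation with a *fixed* core size, computed by alternating (block-wise) updates of the factor matrices, each update improving one block while the others are held fixed. All function names below are illustrative, and NumPy stands in for whatever numerical environment the authors used.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move axis `mode` to the front, flatten the rest.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def mode_product(T, U, mode):
    # Multiply tensor T by matrix U (shape r x I_mode) along `mode`.
    return np.moveaxis(np.tensordot(U, T, axes=(1, mode)), 0, mode)

def tucker_als(T, ranks, n_iter=20):
    # Alternating block updates: HOSVD initialization, then one block
    # (factor matrix) is refreshed per step with the others held fixed.
    factors = [np.linalg.svd(unfold(T, n), full_matrices=False)[0][:, :r]
               for n, r in enumerate(ranks)]
    for _ in range(n_iter):
        for n in range(T.ndim):
            Y = T
            for m in range(T.ndim):
                if m != n:
                    Y = mode_product(Y, factors[m].T, m)
            U, _, _ = np.linalg.svd(unfold(Y, n), full_matrices=False)
            factors[n] = U[:, :ranks[n]]
    # Core tensor: project T onto all factor matrices.
    core = T
    for n in range(T.ndim):
        core = mode_product(core, factors[n].T, n)
    return core, factors

def reconstruct(core, factors):
    # Rebuild the full tensor from the core and the factor matrices.
    X = core
    for n, U in enumerate(factors):
        X = mode_product(X, U, n)
    return X
```

The paper's contribution replaces the fixed `ranks` argument with a search over core configurations, so the core size itself becomes a decision variable of the optimization.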

Citation (APA)

Chen, B., Li, Z., & Zhang, S. (2015). On optimal low rank Tucker approximation for tensors: the case for an adjustable core size. Journal of Global Optimization, 62(4), 811–832. https://doi.org/10.1007/s10898-014-0231-x
