We clarify the equivalence between second-order tensor principal component analysis and two-dimensional singular value decomposition. Furthermore, we show that the two-dimensional discrete cosine transform is a good approximation to both two-dimensional singular value decomposition and classical principal component analysis. Moreover, for the practical computation of the two-dimensional singular value decomposition, we introduce the marginal eigenvector method, which was originally proposed for image compression. To evaluate the performance of the marginal eigenvector method and the two-dimensional discrete cosine transform for dimension reduction, we compute recognition rates for image patterns. The results show that the marginal eigenvector method and the two-dimensional discrete cosine transform achieve almost the same recognition rates for images in six datasets.
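The dimension-reduction scheme summarized above can be illustrated with a minimal sketch: apply a 2D-DCT to an image and retain only the low-frequency block of coefficients, which serves as the reduced representation. The function names and the test pattern below are illustrative assumptions, not the authors' code; only the use of the 2D-DCT for reduction comes from the abstract.

```python
import numpy as np
from scipy.fft import dctn, idctn


def dct2_reduce(image, k):
    """Keep the k x k lowest-frequency 2D-DCT coefficients of an image."""
    coeffs = dctn(image, norm="ortho")  # full 2D discrete cosine transform
    return coeffs[:k, :k]               # low-frequency block = reduced feature


def dct2_reconstruct(reduced, shape):
    """Approximate the original image from the retained coefficients."""
    coeffs = np.zeros(shape)
    k = reduced.shape[0]
    coeffs[:k, :k] = reduced
    return idctn(coeffs, norm="ortho")  # inverse 2D-DCT


# Illustrative example (hypothetical data): a smooth 32x32 pattern is
# well captured by an 8x8 coefficient block (16x fewer values).
x = np.linspace(0.0, 1.0, 32)
img = np.outer(np.sin(np.pi * x), np.cos(np.pi * x))
reduced = dct2_reduce(img, 8)
rec = dct2_reconstruct(reduced, img.shape)
rel_err = np.linalg.norm(img - rec) / np.linalg.norm(img)
```

In a recognition setting, the flattened `k x k` coefficient block would play the role of the low-dimensional feature vector that the abstract compares against the marginal eigenvector method.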
Itoh, H., Imiya, A., & Sakai, T. (2015). Low-dimensional tensor principal component analysis. In Lecture Notes in Computer Science (Vol. 9256, pp. 715–726). Springer Verlag. https://doi.org/10.1007/978-3-319-23192-1_60