Maximally Correlated Principal Component Analysis Based on Deep Parameterization Learning


Abstract

Dimensionality reduction is widely used to deal with high-dimensional data. As a well-known dimensionality reduction method, principal component analysis (PCA), which aims to find low-dimensional features of the original data, has achieved great success, and many improved PCA algorithms have been proposed. However, most PCA-based algorithms consider only the linear correlation of data features. In this article, we propose a novel dimensionality reduction model called maximally correlated PCA based on deep parameterization learning (MCPCADP), which takes nonlinear correlation into account within a deep parameterization framework for the purpose of dimensionality reduction. The new model captures nonlinear correlation by maximizing the Ky Fan norm of the covariance matrix of nonlinearly mapped data features. A new back-propagation (BP) algorithm for model optimization is derived. To assess the proposed method, we conduct experiments on both a synthetic database and several real-world databases. The experimental results demonstrate that the proposed algorithm is comparable to several widely used algorithms.
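To make the core idea concrete, the following is a minimal sketch (not the authors' implementation): a small neural network stands in for the deep parameterization of the nonlinear feature map, and its weights are trained by back-propagation to maximize the Ky Fan k-norm (the sum of the k largest eigenvalues) of the covariance matrix of the mapped features. The network architecture, optimizer, and toy data are illustrative assumptions, and names such as NonlinearMap and train_mcpca_sketch are hypothetical.

```python
# Illustrative sketch of maximizing the Ky Fan k-norm of the covariance of
# nonlinearly mapped features via back-propagation. Assumptions: a small MLP
# as the deep parameterization, Adam as the optimizer, random toy data.
import torch
import torch.nn as nn


def ky_fan_norm(cov, k):
    """Sum of the k largest eigenvalues of a symmetric matrix."""
    eigvals = torch.linalg.eigvalsh(cov)  # eigenvalues in ascending order
    return eigvals[-k:].sum()


class NonlinearMap(nn.Module):
    """Small MLP standing in for the deep parameterization of the feature map."""

    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, out_dim),
        )

    def forward(self, x):
        return self.net(x)


def train_mcpca_sketch(X, k=2, out_dim=10, epochs=200, lr=1e-3):
    """X: (n_samples, n_features) tensor. Returns the trained nonlinear map."""
    model = NonlinearMap(X.shape[1], 64, out_dim)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        z = model(X)
        z = z - z.mean(dim=0, keepdim=True)   # center the mapped features
        cov = z.T @ z / (X.shape[0] - 1)      # sample covariance matrix
        loss = -ky_fan_norm(cov, k)           # maximize the Ky Fan k-norm
        opt.zero_grad()
        loss.backward()                       # gradients via back-propagation
        opt.step()
    return model


if __name__ == "__main__":
    X = torch.randn(500, 20)                  # toy high-dimensional data
    model = train_mcpca_sketch(X)
    Z = model(X)                              # nonlinearly mapped features
```

In this sketch, the low-dimensional representation is read off from the mapped features Z (e.g., by projecting onto the leading eigenvectors of their covariance), which mirrors how PCA extracts principal components after fitting.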

Citation (APA)

Chen, H., Li, J., Gao, J., Sun, Y., Hu, Y., & Yin, B. (2019). Maximally Correlated Principal Component Analysis Based on Deep Parameterization Learning. ACM Transactions on Knowledge Discovery from Data, 13(4). https://doi.org/10.1145/3332183
