Regularized coplanar discriminant analysis for dimensionality reduction

Dimensionality reduction methods based on linear embedding, such as neighborhood preserving embedding (NPE), sparsity preserving projections (SPP), and collaborative representation based projections (CRP), try to preserve a certain kind of linear representation for each sample after projection. However, the linear relationships between samples may change in the transformed low-dimensional space, which prevents linear representation-based classifiers, such as the sparse representation-based classifier (SRC), from achieving high recognition accuracy. In this paper, we propose a new linear dimensionality reduction algorithm, called Regularized Coplanar Discriminant Analysis (RCDA), to address this problem. It simultaneously seeks a linear projection matrix and linear representation coefficients that make samples from the same class coplanar and samples from different classes non-coplanar. The proposed regularization term balances the bias from the optimal linear representation against that from the class mean, avoiding overfitting to the training data, and overcomes matrix singularity when solving for the linear representation coefficients. An alternating optimization approach is proposed to solve the RCDA model. Experiments on several benchmark face databases and hyperspectral image databases show that RCDA obtains better performance than other dimensionality reduction methods.
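The alternating scheme described above can be illustrated with a minimal sketch: for a fixed projection, each sample's representation coefficients over its same-class neighbors are solved by ridge-regularized least squares (the regularization also sidesteps the matrix singularity mentioned in the abstract); for fixed coefficients, the projection is updated from the eigenvectors that minimize the within-class reconstruction error. All function names, parameters, and the specific objective below are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def rcda_sketch(X, y, dim, lam=0.1, n_iter=5):
    """Illustrative alternating optimization in the spirit of RCDA.

    X : (n_samples, n_features) data matrix
    y : (n_samples,) class labels
    dim : target dimensionality
    lam : ridge regularization weight (hypothetical parameter)
    """
    n, d = X.shape
    rng = np.random.default_rng(0)
    # initialize with a random orthonormal projection
    W, _ = np.linalg.qr(rng.standard_normal((d, dim)))
    for _ in range(n_iter):
        # Step 1: fix W, solve regularized coefficients per sample
        # over its same-class neighbors in the projected space
        Z = X @ W
        C = np.zeros((n, n))
        for i in range(n):
            idx = np.where((y == y[i]) & (np.arange(n) != i))[0]
            A = Z[idx].T                                  # (dim, k)
            # ridge term lam*I keeps the system nonsingular
            coef = np.linalg.solve(A.T @ A + lam * np.eye(len(idx)),
                                   A.T @ Z[i])
            C[i, idx] = coef
        # Step 2: fix C, update W to minimize the projected
        # reconstruction error ||W^T x_i - W^T X^T c_i||^2
        M = np.eye(n) - C
        S = X.T @ (M.T @ M) @ X + lam * np.eye(d)
        evals, evecs = np.linalg.eigh(S)
        W = evecs[:, :dim]                  # smallest-eigenvalue directions
    return W
```

A discriminative version would also push samples of different classes away from each other's planes; this sketch keeps only the coplanarity-preserving part to stay short.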




Huang, K. K., Dai, D. Q., & Ren, C. X. (2017). Regularized coplanar discriminant analysis for dimensionality reduction. Pattern Recognition, 62, 87–98.
