An improved NN training scheme using two-stage LDA features for face recognition


Abstract

This paper presents a new approach to face recognition based on Two-Stage Linear Discriminant Analysis (Two-Stage LDA) and Conjugate Gradient Algorithms (CGAs). A Two-Stage LDA technique is proposed that utilises the null space of the sample covariance matrix as well as the range space of the between-class scatter matrix to extract discriminant information. Classic Back Propagation (BP) is a widely used Neural Network (NN) training algorithm in many detectors and classifiers. However, it is too slow for many practical problems, and its performance is unsatisfactory in many application areas, including face recognition. To overcome these problems, four CGAs (Fletcher-Reeves, Polak-Ribiere, Powell-Beale, and scaled CGA) have been proposed, the utility of which we investigate here in combination with Two-Stage LDA features. To further improve the accuracy, a modified AdaBoost.M1 approach was employed, which combines the results of several NN classifiers into a single strong classifier. Experiments are performed on the ORL, FERET and AR face databases. The results show that all of the proposed methods lead to higher recognition rates and shorter training times than classic BP. © 2012 Springer-Verlag.
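The feature-extraction stage described in the abstract can be illustrated with a minimal sketch of null-space LDA, the standard formulation of this family of methods: stage one restricts attention to the null space of the within-class scatter (where within-class variation vanishes, useful in the small-sample-size regime typical of face data), and stage two maximises between-class scatter inside that subspace. The function name and the exact staging are assumptions for illustration; the paper's precise construction (which also involves the sample covariance matrix) may differ.

```python
import numpy as np

def two_stage_lda(X, y, n_components):
    """Sketch of a two-stage (null-space) LDA projection.

    Stage 1: find a basis of the null space of the within-class
    scatter S_w. Stage 2: within that subspace, keep the directions
    that maximise the between-class scatter S_b.
    """
    classes = np.unique(y)
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))  # within-class scatter
    Sb = np.zeros((d, d))  # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean_all).reshape(-1, 1)
        Sb += Xc.shape[0] * (diff @ diff.T)

    # Stage 1: columns of U with (numerically) zero singular values
    # span the null space of S_w.
    U, s, _ = np.linalg.svd(Sw)
    null_basis = U[:, s < 1e-10 * s.max()]

    # Stage 2: eigen-decompose S_b projected into that null space and
    # keep the leading directions.
    Sb_proj = null_basis.T @ Sb @ null_basis
    evals, evecs = np.linalg.eigh(Sb_proj)
    order = np.argsort(evals)[::-1][:n_components]
    W = null_basis @ evecs[:, order]
    return W  # discriminant features are X @ W
```

Any direction in the null space of S_w collapses each class to a single point, so class separation in the projected space is driven entirely by the between-class scatter retained in stage two.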

Citation (APA)

Bozorgtabar, B., & Goecke, R. (2012). An improved NN training scheme using two-stage LDA features for face recognition. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7667 LNCS, pp. 662–671). https://doi.org/10.1007/978-3-642-34500-5_78
