Effect of Dropout and Batch Normalization in Siamese Network for Face Recognition

Abstract

This paper focuses on maximizing feature extraction and classification performance using one-shot learning (meta-learning). The present work discusses how to maximize the performance of a Siamese Neural Network using various regularization and normalization techniques when training for very few epochs. We perform multi-class face recognition, and a unique pairing of face images allows us to examine the generalization capacity of the network, which is evaluated on the AT&T-ORL face database. Our experiments investigate how learning can be made to converge within a few epochs, and the approach also performs strongly on unseen test data, reaching about 96.01% accuracy. In addition, we discuss ways to speed up learning specifically for a Siamese network and achieve convergence within 5 epochs, and we identify one of the more effective regularization techniques for rapidly reducing the loss. Our findings indicate that normalization alone is the effective approach when training for few epochs, and that the Dropout-after-Batch-Normalization configuration results in smooth loss reduction.
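
To make the "Dropout after Batch Normalization" ordering concrete, the sketch below shows a minimal Siamese branch with shared weights and a contrastive loss on image pairs, written in PyTorch. The layer sizes, dropout rate, margin, and the choice of contrastive loss are illustrative assumptions, not the authors' exact configuration; the input dimensions follow the 112x92 grayscale images of the AT&T-ORL database.

    # Illustrative sketch only: a Siamese branch using the
    # Batch Normalization -> Dropout ordering discussed in the paper.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SiameseBranch(nn.Module):
        def __init__(self, embedding_dim: int = 128, p_drop: float = 0.3):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 32, kernel_size=3, padding=1),
                nn.BatchNorm2d(32),    # normalize activations first ...
                nn.Dropout2d(p_drop),  # ... then apply dropout (BN -> Dropout)
                nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, kernel_size=3, padding=1),
                nn.BatchNorm2d(64),
                nn.Dropout2d(p_drop),
                nn.ReLU(),
                nn.MaxPool2d(2),
            )
            self.fc = nn.LazyLinear(embedding_dim)

        def forward(self, x):
            x = self.features(x)
            return self.fc(torch.flatten(x, 1))

    def contrastive_loss(e1, e2, label, margin: float = 1.0):
        # label = 1 for a genuine (same-identity) pair, 0 for an impostor pair
        d = F.pairwise_distance(e1, e2)
        return torch.mean(label * d.pow(2) +
                          (1 - label) * torch.clamp(margin - d, min=0).pow(2))

    # Both images of a pair pass through the *same* branch (shared weights).
    branch = SiameseBranch()
    img_a = torch.randn(8, 1, 112, 92)   # batch of 8 grayscale face images
    img_b = torch.randn(8, 1, 112, 92)
    labels = torch.randint(0, 2, (8,)).float()
    loss = contrastive_loss(branch(img_a), branch(img_b), labels)

In this sketch, swapping the order of the BatchNorm2d and Dropout2d layers in each block is the only change needed to compare the two configurations discussed in the paper.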

Citation (APA)

Chakraborty, N., Dan, A., Chakraborty, A., & Neogy, S. (2020). Effect of Dropout and Batch Normalization in Siamese Network for Face Recognition. In Advances in Intelligent Systems and Computing (Vol. 1059, pp. 21–37). Springer. https://doi.org/10.1007/978-981-15-0324-5_3
