Effect of Laplacian Smoothing Stochastic Gradient Descent with Angular Margin Softmax Loss on Face Recognition

Abstract

Choosing a proper loss function and optimization technique is an important task in deep learning for face recognition, and several loss functions trained with stochastic gradient descent have been proposed for it. This work proposes a strategy that combines Laplacian smoothing stochastic gradient descent with a multiplicative angular margin to enhance the angularly discriminative features learned by the angular margin softmax loss for face recognition. The model is trained on the popular CASIA-WebFace face recognition dataset and achieves state-of-the-art performance on several academic benchmark datasets, including Labeled Faces in the Wild (LFW), YouTube Faces (YTF), VGGFace1, and VGGFace2. Our method achieves a new record accuracy of 99.54% on LFW and 95.53% on YTF.
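
Since the page carries only the abstract, a brief illustration may help make the two ingredients concrete. Laplacian smoothing SGD pre-multiplies each gradient by (I − σL)⁻¹, where L is a one-dimensional discrete periodic Laplacian; this solve can be done in O(n log n) with an FFT. The multiplicative angular margin replaces the target-class logit cos(θ) with cos(mθ), in the spirit of SphereFace's A-Softmax. The PyTorch sketch below illustrates both pieces under stated assumptions: the function and class names, σ = 1, m = 4, the per-tensor (rather than fully flattened) smoothing, and the omission of SphereFace's monotonic extension ψ(θ) and margin annealing are choices made here for illustration, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def laplacian_smooth(grad: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
    """Solve (I - sigma * L) g_s = g via FFT, with L the 1-D periodic Laplacian.

    Standard FFT formulation of Laplacian smoothing SGD; applying it per
    parameter tensor is a simplifying assumption made for this sketch.
    """
    g = grad.reshape(-1)
    n = g.numel()
    if n < 3:  # tensor too small for the [1, -2, 1] stencil (e.g. tiny biases)
        return grad
    v = torch.zeros(n, dtype=torch.float32, device=g.device)
    v[0], v[1], v[-1] = -2.0, 1.0, 1.0          # Laplacian stencil
    denom = 1.0 - sigma * torch.fft.fft(v).real  # spectrum is real and >= 1
    g_s = torch.fft.ifft(torch.fft.fft(g.float()) / denom).real
    return g_s.to(grad.dtype).reshape(grad.shape)


class MultiplicativeAngularMarginLoss(nn.Module):
    """Simplified A-Softmax-style loss: cos(m * theta) on the target class."""

    def __init__(self, feat_dim: int, num_classes: int, m: int = 4):
        super().__init__()
        self.m = m
        self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, feats: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine between features and L2-normalized class weight rows.
        cos = F.linear(F.normalize(feats), F.normalize(self.weight))
        theta = torch.acos(cos.clamp(-1 + 1e-7, 1 - 1e-7))
        one_hot = F.one_hot(labels, cos.size(1)).bool()
        # Multiplicative margin on the target angle only; full SphereFace also
        # uses a monotonic extension psi(theta) and annealing, omitted here.
        logits = torch.where(one_hot, torch.cos(self.m * theta), cos)
        return F.cross_entropy(feats.norm(dim=1, keepdim=True) * logits, labels)
```

In a training step one would call `loss.backward()`, replace each parameter's `grad` with `laplacian_smooth(grad)`, and then take the usual SGD step; setting `sigma` to zero recovers plain SGD.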

Citation (APA)

Iqbal, M., Rehman, M. A., Iqbal, N., & Iqbal, Z. (2020). Effect of Laplacian Smoothing Stochastic Gradient Descent with Angular Margin Softmax Loss on Face Recognition. In Communications in Computer and Information Science (Vol. 1198, pp. 549–561). Springer. https://doi.org/10.1007/978-981-15-5232-8_47
