A Unit Softmax with Laplacian Smoothing Stochastic Gradient Descent for Deep Convolutional Neural Networks

Abstract

Several techniques have been designed in recent years to improve the performance of deep architectures by means of appropriate loss functions or activation functions. Softmax is arguably the traditional, convenient choice for training Deep Convolutional Neural Networks (DCNNs) on classification tasks. However, modern deep learning architectures have exposed its limitations in feature discriminability. In this paper, we provide a supervision signal for discriminative image features through a modification of softmax that boosts the discriminative power of the loss function. Amending the original softmax loss and motivated by the A-Softmax loss for face recognition, we fix the angular margin to introduce a unit-margin softmax loss. This alternative form of softmax is trainable, easy to optimize, stable when used with Stochastic Gradient Descent (SGD) and Laplacian Smoothing Stochastic Gradient Descent (LS-SGD), and applicable to digit classification in images. Experimental results demonstrate state-of-the-art performance on the well-known Modified National Institute of Standards and Technology (MNIST) database of handwritten digits.
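
The following is a minimal sketch (not the authors' code) of the two ingredients named in the abstract. It assumes the unit-margin softmax reduces to the A-Softmax formulation with the angular margin fixed to 1 (unit-norm class weights, no bias, logits equal to ||x|| cos θ), and that LS-SGD smooths each gradient with the operator (I - σΔ)^{-1}, applied efficiently in the Fourier domain as in Laplacian Smoothing SGD. All names (UnitSoftmaxLoss, laplacian_smooth, ls_sgd_step) and the hyperparameters lr and sigma are illustrative assumptions, not taken from the paper.

    # Sketch under the assumptions stated above; PyTorch is used for illustration.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class UnitSoftmaxLoss(nn.Module):
        """Softmax cross-entropy on ||x|| * cos(theta) logits (angular margin fixed to 1)."""

        def __init__(self, feat_dim: int, num_classes: int):
            super().__init__()
            self.weight = nn.Parameter(torch.randn(num_classes, feat_dim))

        def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
            # Normalize class weights to the unit sphere; keep the feature norm ||x||.
            w = F.normalize(self.weight, dim=1)                  # (C, D), ||w_j|| = 1
            cos_theta = F.normalize(features, dim=1) @ w.t()     # (N, C) cosine similarities
            logits = features.norm(dim=1, keepdim=True) * cos_theta
            return F.cross_entropy(logits, labels)


    def laplacian_smooth(grad: torch.Tensor, sigma: float = 1.0) -> torch.Tensor:
        """Apply (I - sigma * Laplacian)^{-1} to a flattened gradient via FFT."""
        g = grad.reshape(-1)
        n = g.numel()
        k = torch.arange(n, dtype=g.dtype, device=g.device)
        # Eigenvalues of the smoothing operator for the periodic 1-D discrete Laplacian.
        denom = 1.0 + 2.0 * sigma * (1.0 - torch.cos(2.0 * torch.pi * k / n))
        smoothed = torch.fft.ifft(torch.fft.fft(g) / denom).real
        return smoothed.reshape(grad.shape)


    def ls_sgd_step(params, lr: float = 0.1, sigma: float = 1.0):
        """One hand-rolled LS-SGD update: smooth each gradient, then take a descent step."""
        with torch.no_grad():
            for p in params:
                if p.grad is not None:
                    p -= lr * laplacian_smooth(p.grad, sigma)

In a training loop, UnitSoftmaxLoss would replace the usual linear classifier plus cross-entropy on the DCNN's penultimate features, and ls_sgd_step would replace the plain SGD parameter update after backward(); setting sigma to 0 recovers ordinary SGD.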

Citation (APA)

Ul Rahman, J., Ali, A., Ur Rehman, M., & Kazmi, R. (2020). A Unit Softmax with Laplacian Smoothing Stochastic Gradient Descent for Deep Convolutional Neural Networks. In Communications in Computer and Information Science (Vol. 1198, pp. 162–174). Springer. https://doi.org/10.1007/978-981-15-5232-8_14
