Optimizing Deep Network for Image Classification with Hyper Parameter Tuning

  • Gogoi, M.
  • Begum, S. A.

Abstract

A deep network model comprises several processing layers, and deep learning techniques represent data at diverse levels of abstraction. Given the practical importance and efficiency of machine learning, optimization of deep models is carried out with respect to the objective function and its parameters for a particular problem. The present work offers an empirical analysis of the performance of stochastic optimization methods with regard to hyperparameters for deep Convolutional Neural Networks (CNNs) and examines the rate of convergence of these optimization methods in high-dimensional parameter spaces. Experiments were carried out on a deep CNN model with different optimization methods, viz. SGD, AdaGrad, AdaDelta, and Adam. The empirical results are evaluated on the benchmark CIFAR10 and CIFAR100 datasets. The optimal hyperparameter values obtained demonstrate that the Adam optimizer yields the best results compared to the other methods, viz. SGD, AdaGrad, and AdaDelta, over the considered datasets. Further, it is noted that classification accuracy can be increased by choosing the best optimization technique together with hyperparameter tuning to obtain the optimal configuration of the deep CNN model.
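For illustration, the sketch below (not the authors' code) mirrors the experimental setup the abstract describes: the same small CNN trained on CIFAR-10 once per optimizer (SGD, AdaGrad, AdaDelta, Adam), so their test accuracies can be compared. The architecture, learning rates, and epoch count are illustrative assumptions; the paper's actual configuration and hyperparameter grid are not given in the abstract.

```python
# Minimal sketch of the optimizer comparison described in the abstract.
# Architecture and hyperparameter values are assumptions, not the paper's.
from tensorflow import keras
from tensorflow.keras import layers

# CIFAR-10: 50k train / 10k test images, 32x32x3, 10 classes.
(x_train, y_train), (x_test, y_test) = keras.datasets.cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

def build_cnn():
    # Small illustrative CNN; the paper's exact architecture is not stated.
    return keras.Sequential([
        layers.Conv2D(32, 3, activation="relu", input_shape=(32, 32, 3)),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])

# One candidate learning rate per optimizer; in practice each would be
# tuned over a grid, as the paper's hyperparameter search suggests.
optimizers = {
    "SGD": keras.optimizers.SGD(learning_rate=0.01, momentum=0.9),
    "AdaGrad": keras.optimizers.Adagrad(learning_rate=0.01),
    "AdaDelta": keras.optimizers.Adadelta(learning_rate=1.0),
    "Adam": keras.optimizers.Adam(learning_rate=0.001),
}

for name, opt in optimizers.items():
    model = build_cnn()
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train, epochs=10, batch_size=128,
              validation_split=0.1, verbose=0)
    _, acc = model.evaluate(x_test, y_test, verbose=0)
    print(f"{name}: test accuracy = {acc:.4f}")
```

Running the loop with a shared architecture isolates the effect of the optimizer itself, which is the comparison the paper reports; the same harness extends to CIFAR-100 by swapping the dataset loader and output width.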

Citation (APA)

Gogoi, M., & Begum, S. A. (2019). Optimizing Deep Network for Image Classification with Hyper Parameter Tuning. International Journal of Engineering and Advanced Technology, 9(2), 2264–2269. https://doi.org/10.35940/ijeat.b3515.129219
