Refactoring of Neural Network Models for Hyperparameter Optimization in Serverless Cloud

Abstract

Machine Learning, and Neural Networks in particular, have become hot topics in Computer Science. The 2019 Turing Award given to the forefathers of Deep Learning and AI, Yoshua Bengio, Geoffrey Hinton, and Yann LeCun, underscores the importance of the technology and its effect on science and industry. However, even today, state-of-the-art methods require several manual steps for neural network hyperparameter optimization. Our approach automates model tuning by refactoring the original Python code using open-source libraries. We identify hyperparameters by parsing and analyzing the original source; given these parameters, we refactor the model, add calls to a state-of-the-art optimization library, and run the updated code in the serverless cloud. Our approach eliminates the manual steps of tuning arbitrary TensorFlow and Keras models. We have created a tool called OptPar that automatically refactors an arbitrary Deep Neural Network and optimizes its hyperparameters. Such a transformation can save Data Scientists hours of time, letting them concentrate on designing their Machine Learning algorithms.
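The abstract does not spell out how OptPar discovers hyperparameters, but the parsing-and-analysis step it describes can be sketched with Python's standard ast module. Everything below is an illustrative assumption rather than the authors' implementation: the helper name find_hyperparameters, the candidate keyword list, and the toy Keras snippet are all hypothetical.

    import ast

    # Hypothetical list of keyword arguments treated as tunable hyperparameters;
    # the paper does not name OptPar's actual heuristics.
    HYPERPARAM_KWARGS = {"learning_rate", "lr", "batch_size", "epochs", "units", "dropout"}

    def find_hyperparameters(source: str):
        """Walk the AST of a TensorFlow/Keras script and collect literal
        keyword arguments that look like tunable hyperparameters."""
        found = []
        for node in ast.walk(ast.parse(source)):
            if isinstance(node, ast.Call):
                for kw in node.keywords:
                    # Only literal values (e.g. 0.001, 32) are candidates for tuning.
                    if kw.arg in HYPERPARAM_KWARGS and isinstance(kw.value, ast.Constant):
                        found.append((kw.arg, kw.value.value, node.lineno))
        return found

    # Toy example of a user's Keras script, for illustration only.
    example_script = """
    model.add(Dense(units=64, activation='relu'))
    model.compile(optimizer=Adam(learning_rate=0.001))
    model.fit(x, y, batch_size=32, epochs=10)
    """
    for name, value, line in find_hyperparameters(example_script):
        print(f"line {line}: {name} = {value}")

In a pipeline like the one described, the tuples returned here would presumably drive the refactoring step, for example replacing each literal with a search-space declaration from an optimization library before the tuning runs are dispatched to serverless functions.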

Citation (APA)

Kaplunovich, A., & Yesha, Y. (2020). Refactoring of Neural Network Models for Hyperparameter Optimization in Serverless Cloud. In Proceedings - 2020 IEEE/ACM 42nd International Conference on Software Engineering Workshops, ICSEW 2020 (pp. 311–314). Association for Computing Machinery, Inc. https://doi.org/10.1145/3387940.3392268
