Using Neural Architecture Search to Optimize Neural Networks for Embedded Devices

Abstract

Recent advances in the field of Neural Architecture Search (NAS) have made it possible to develop state-of-the-art deep learning systems without requiring extensive human expertise and hyperparameter tuning. Most previous research, however, has given little attention to the resources required to run the generated systems. In this paper, we present an improvement on a recent NAS method, Efficient Neural Architecture Search (ENAS). We adapt ENAS to take into account not only the network’s performance but also various constraints that would allow these networks to be ported to embedded devices. Our results show ENAS’ ability to comply with these added constraints. To demonstrate the efficacy of our system, we use it to design a Recurrent Neural Network (RNN) that predicts words as they are spoken and meets the constraints set out for operation on an embedded device.
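The abstract describes folding resource constraints into the search objective alongside accuracy. The sketch below illustrates one common way such a constraint-aware reward can be formulated for an ENAS-style controller; the budget values, names, and penalty form are illustrative assumptions, not the exact formulation used in the paper.

```python
# Hedged sketch: a constraint-aware reward for an ENAS-style controller.
# Budgets, names, and the penalty form are illustrative assumptions.

from dataclasses import dataclass

# Example embedded-device budgets (assumed values, not from the paper).
MAX_PARAMS = 2_000_000      # parameter budget
MAX_LATENCY_MS = 20.0       # per-inference latency budget


@dataclass
class CandidateMetrics:
    accuracy: float      # validation accuracy in [0, 1]
    num_params: int      # parameter count of the sampled architecture
    latency_ms: float    # measured or estimated inference latency


def constrained_reward(m: CandidateMetrics, penalty_weight: float = 1.0) -> float:
    """Reward = accuracy minus a penalty for violating resource budgets.

    Architectures within budget are rewarded purely on accuracy; violations
    are penalized in proportion to how far they exceed the budget, which
    steers the controller toward architectures that remain deployable.
    """
    param_violation = max(0.0, m.num_params / MAX_PARAMS - 1.0)
    latency_violation = max(0.0, m.latency_ms / MAX_LATENCY_MS - 1.0)
    return m.accuracy - penalty_weight * (param_violation + latency_violation)


if __name__ == "__main__":
    within_budget = CandidateMetrics(accuracy=0.91, num_params=1_500_000, latency_ms=15.0)
    over_budget = CandidateMetrics(accuracy=0.93, num_params=4_000_000, latency_ms=35.0)
    print(constrained_reward(within_budget))  # 0.91: no penalty applied
    print(constrained_reward(over_budget))    # penalized despite higher accuracy
```

With a reward of this shape, the controller's policy-gradient update favors architectures that trade a small amount of accuracy for compliance with the device's memory and latency limits.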

Citation (APA)
Cassimon, T., Vanneste, S., Bosmans, S., Mercelis, S., & Hellinckx, P. (2020). Using Neural Architecture Search to Optimize Neural Networks for Embedded Devices. In Lecture Notes in Networks and Systems (Vol. 96, pp. 684–693). Springer. https://doi.org/10.1007/978-3-030-33509-0_64
