DartsReNet: Exploring New RNN Cells in ReNet Architectures

Abstract

We present new Recurrent Neural Network (RNN) cells for image classification using a Neural Architecture Search (NAS) approach called DARTS. We are interested in the ReNet architecture, an RNN-based approach presented as an alternative to convolution and pooling steps. ReNet can be defined with any standard RNN cell, such as LSTM or GRU. One limitation is that standard RNN cells were designed for one-dimensional sequential data, not for the two dimensions encountered in image classification. We overcome this limitation by using DARTS to find new cell designs. We compare our results with ReNet using GRU and LSTM cells. The cells we found outperform the standard RNN cells on CIFAR-10 and SVHN. The improvement on SVHN indicates generalizability, as we derived the RNN cell designs on CIFAR-10 without performing a new cell search for SVHN. (The source code of our approach and experiments is available at https://github.com/LuckyOwl95/DartsReNet/.)
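
For intuition, below is a minimal PyTorch sketch of a single ReNet-style layer, assuming non-overlapping patches and GRU cells. The module name, patch-extraction logic, and hyperparameters are illustrative assumptions, not taken from the paper or its repository.

```python
import torch
import torch.nn as nn

class ReNetLayer(nn.Module):
    """Sketch of a ReNet-style layer: a conv+pool block is replaced by two
    bidirectional RNN sweeps over image patches (horizontal, then vertical).
    Names and hyperparameters are illustrative, not the authors' code."""

    def __init__(self, in_channels: int, patch_size: int, hidden_size: int):
        super().__init__()
        self.patch_size = patch_size
        patch_dim = in_channels * patch_size * patch_size
        # Any standard RNN cell (GRU/LSTM) fits here; the paper instead
        # searches for new cell designs with DARTS.
        self.horizontal = nn.GRU(patch_dim, hidden_size,
                                 bidirectional=True, batch_first=True)
        self.vertical = nn.GRU(2 * hidden_size, hidden_size,
                               bidirectional=True, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) -> grid of non-overlapping p x p patches
        B, C, H, W = x.shape
        p = self.patch_size
        x = x.unfold(2, p, p).unfold(3, p, p)            # (B, C, H/p, W/p, p, p)
        x = x.permute(0, 2, 3, 1, 4, 5).reshape(B, H // p, W // p, -1)
        # Horizontal sweep: each row of patches is a sequence.
        rows = x.reshape(B * (H // p), W // p, -1)
        rows, _ = self.horizontal(rows)                   # (B*H', W', 2*hidden)
        x = rows.reshape(B, H // p, W // p, -1)
        # Vertical sweep: each column of the result is a sequence.
        cols = x.permute(0, 2, 1, 3).reshape(B * (W // p), H // p, -1)
        cols, _ = self.vertical(cols)
        x = cols.reshape(B, W // p, H // p, -1).permute(0, 3, 2, 1)
        return x                                          # (B, 2*hidden, H/p, W/p)
```

Stacking a few such layers and ending with a fully connected classifier yields the RNN-only image pipeline described in the abstract; the paper's contribution is to replace the GRU/LSTM cells in these sweeps with cell designs found by DARTS.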

Citation (APA)

Moser, B. B., Raue, F., Hees, J., & Dengel, A. (2020). DartsReNet: Exploring new RNN cells in ReNet architectures. In Lecture Notes in Computer Science (Vol. 12396, pp. 850–861). Springer. https://doi.org/10.1007/978-3-030-61609-0_67
