An Empirical Exploration of Deep Recurrent Connections Using Neuro-Evolution


Abstract

Neuro-evolution and neural architecture search algorithms have gained significant interest due to the challenges of designing optimal artificial neural networks (ANNs). While these algorithms have the potential to outperform the best human-crafted architectures, they are less commonly used as tools for analyzing ANN topologies and structural components. By running these algorithms while varying the allowable components, the best-performing architectures for each set of components can be found and compared against one another, allowing a best-case comparison of component capabilities: a more rigorous examination than simply applying those components within standard fixed topologies. In this work, we use the Evolutionary eXploration of Augmenting Memory Models (EXAMM) algorithm to perform a rigorous examination and comparison of recurrent neural networks (RNNs) applied to time series prediction. Specifically, EXAMM is used to investigate the capabilities of recurrent memory cells as well as complex recurrent connectivity patterns that span one or more steps in time, i.e., deep recurrent connections. Over 10.56 million RNNs were evolved and trained in 5,280 repeated experiments with varying components. Many modern hand-crafted RNNs rely on complex memory cells (whose internal recurrent connections span only a single time step), operating under the assumption that these cells sufficiently latch information and handle long-term dependencies. However, our results show that networks evolved with deep recurrent connections perform significantly better than those without. More importantly, in some cases the best-performing RNNs consisted of only simple neurons and deep time-skip connections, without any memory cells. These results strongly suggest that the use of deep time-skip connections in RNNs for time series prediction not only deserves further, dedicated study, but also demonstrates the potential of neuro-evolution as a means to better study, understand, and train effective RNNs.
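To make the central idea concrete, the sketch below shows an Elman-style RNN whose hidden state receives input not only from the previous step but also from a state several steps in the past, i.e., a deep recurrent (time-skip) connection. This is a minimal illustration of the connectivity pattern the abstract describes, not EXAMM's evolved topology or training procedure; all function names, weight scales, and the `skip` parameter are illustrative assumptions.

```python
import numpy as np

def run_simple_rnn(inputs, skip=3, hidden=4, seed=0):
    """Forward pass of a tiny Elman-style RNN with one deep recurrent
    connection: h_t depends on h_{t-1} and on h_{t-skip}.
    Illustrative sketch only -- weights are random and untrained."""
    rng = np.random.default_rng(seed)
    W_in = rng.normal(scale=0.1, size=(hidden, 1))      # input -> hidden
    W_rec = rng.normal(scale=0.1, size=(hidden, hidden))  # h_{t-1} -> h_t
    W_skip = rng.normal(scale=0.1, size=(hidden, hidden))  # h_{t-skip} -> h_t
    W_out = rng.normal(scale=0.1, size=(1, hidden))     # hidden -> output

    history = [np.zeros((hidden, 1))]  # h_0
    outputs = []
    for t, x in enumerate(inputs, start=1):
        h_prev = history[t - 1]
        # The deep recurrent connection: reach back `skip` steps in time
        # when that state exists, otherwise use a zero state.
        h_skip = history[t - skip] if t - skip >= 0 else np.zeros((hidden, 1))
        h = np.tanh(W_in * x + W_rec @ h_prev + W_skip @ h_skip)
        history.append(h)
        outputs.append((W_out @ h).item())
    return outputs
```

A standard memory cell (e.g., an LSTM) latches information through an internal recurrence that spans only one step; the `W_skip` term above instead carries state directly across multiple steps, which is the structural alternative the paper's experiments compare.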

Citation (APA)

Desell, T., ElSaid, A. E. R., & Ororbia, A. G. (2020). An Empirical Exploration of Deep Recurrent Connections Using Neuro-Evolution. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12104 LNCS, pp. 546–561). Springer. https://doi.org/10.1007/978-3-030-43722-0_35
