Reducing the Number of Multiplications in Convolutional Recurrent Neural Networks (ConvRNNs)

Abstract

Convolutional variants of recurrent neural networks, ConvRNNs, are widely used for spatio-temporal modeling, since they are well suited to modeling sequences of two-dimensional inputs. As in conventional RNNs, introducing gating architectures such as ConvLSTM brings additional parameters and increases the computational complexity. This computational load can be an obstacle to training efficient models and to deploying ConvRNNs in real-world applications. However, the correspondence between ConvRNN unit complexity and performance has not been well investigated. We propose to reduce the number of parameters and multiplications by substituting some convolutional operations with the Hadamard product. We evaluate the proposal on the task of next video frame prediction using the Moving MNIST dataset. The proposed method requires 38% fewer multiplications and 21% fewer parameters than its fully convolutional counterpart. As the price of the reduced computational complexity, performance measured by the structural similarity index measure (SSIM) decreased by about 1.5%. ConvRNNs with reduced computation can be used in a wider range of situations, such as web applications or embedded systems. This paper is an extension of a selected paper from JSAI2019 [10].
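The abstract does not specify which convolutional operations are replaced, so the following is only a minimal, hypothetical sketch of the general idea: a ConvLSTM-style cell in which the recurrent (hidden-to-gate) convolutions are swapped for learned elementwise Hadamard weights, while the input-to-gate path keeps its convolution. The class name, tensor shapes, and the choice of which path to substitute are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: a ConvLSTM-like cell where the recurrent path uses a
# Hadamard (elementwise) product with a learned weight tensor instead of
# a convolution. Which path is substituted is an assumption for illustration.
import torch
import torch.nn as nn

class HadamardConvLSTMCell(nn.Module):
    def __init__(self, in_ch, hid_ch, height, width, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        # Input-to-gate path keeps a convolution producing all 4 gates (i, f, o, g).
        self.conv_x = nn.Conv2d(in_ch, 4 * hid_ch, kernel_size, padding=pad)
        # Hidden-to-gate path: learned elementwise weights replace the
        # recurrent convolution, removing most of its multiplications.
        self.w_h = nn.Parameter(torch.randn(4 * hid_ch, height, width) * 0.01)
        self.hid_ch = hid_ch

    def forward(self, x, state):
        h, c = state
        # Recurrent term is a Hadamard product rather than a convolution.
        gates = self.conv_x(x) + self.w_h * h.repeat(1, 4, 1, 1)
        i, f, o, g = torch.split(gates, self.hid_ch, dim=1)
        i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
        c_next = f * c + i * torch.tanh(g)
        h_next = o * torch.tanh(c_next)
        return h_next, c_next

# Example usage on Moving-MNIST-sized frames (64x64, single channel).
cell = HadamardConvLSTMCell(in_ch=1, hid_ch=16, height=64, width=64)
x = torch.randn(2, 1, 64, 64)
h = torch.zeros(2, 16, 64, 64)
c = torch.zeros(2, 16, 64, 64)
h, c = cell(x, (h, c))
```

In this sketch the saving comes from the recurrent path: a k x k convolution over the hidden state costs on the order of k^2 multiplications per output element, whereas the Hadamard product costs one, which is the kind of trade-off the paper quantifies (fewer multiplications and parameters at a small cost in SSIM).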

Citation (APA)

Vazhenina, D., & Kanemura, A. (2020). Reducing the Number of Multiplications in Convolutional Recurrent Neural Networks (ConvRNNs). In Advances in Intelligent Systems and Computing (Vol. 1128 AISC, pp. 45–52). Springer. https://doi.org/10.1007/978-3-030-39878-1_5
