Analysis of recurrent neural network and predictions


Abstract

This paper analyzes the operating principle and predictive value of the recurrent neural network (RNN), the most basic neural-network structure and the one best suited to time-varying data, across various types of artificial intelligence (AI). In particular, an RNN in which all connections are symmetric is guaranteed to converge. An RNN operates by forming linear combinations of data and passing them through nonlinear activation functions. The linearly combined data resemble the autoregressive moving average (ARMA) method of statistical processing. However, the distortion introduced by the nonlinear activation function causes the RNN's predicted value to differ from the ARMA prediction. From this, we derive the limit of an RNN's predicted value and the range of prediction, which changes with the training data. In addition to mathematical proofs, numerical experiments confirm our claims.
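To make the abstract's core comparison concrete, here is a minimal sketch (not the paper's code) contrasting one step of an Elman-style RNN cell with the same recurrence stripped of its nonlinearity, which behaves like a linear autoregressive (ARMA-style) update. All weights, inputs, and function names below are illustrative placeholders, not values from the paper.

```python
# Sketch: an RNN step is a nonlinear activation applied to a linear
# combination of the current input and the previous state. Removing
# the activation leaves a purely linear (AR-like) recurrence.
import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    """One RNN step: tanh of a linear combination (hypothetical weights)."""
    return np.tanh(W_x * x_t + W_h * h_prev + b)

def linear_step(x_t, h_prev, W_x, W_h, b):
    """The same linear combination without the nonlinearity (AR-like)."""
    return W_x * x_t + W_h * h_prev + b

# Illustrative scalar parameters and a toy input sequence.
W_x, W_h, b = 0.8, 0.5, 0.0
xs = [0.1, 0.4, 1.5, 3.0]  # growing inputs expose the distortion

h_rnn, h_lin = 0.0, 0.0
for x in xs:
    h_rnn = rnn_step(x, h_rnn, W_x, W_h, b)
    h_lin = linear_step(x, h_lin, W_x, W_h, b)
    print(f"x={x:4.1f}  rnn={h_rnn:+.4f}  linear={h_lin:+.4f}")
# As inputs grow, tanh saturates near +/-1, so the RNN's prediction is
# bounded while the linear recurrence keeps growing -- the distortion
# and prediction limit the abstract refers to.
```

Running the loop shows the two recurrences agreeing for small inputs and diverging as the activation saturates, which is the qualitative gap between RNN and ARMA predictions the paper analyzes.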

Citation (APA)
Park, J., Yi, D., & Ji, S. (2020). Analysis of recurrent neural network and predictions. Symmetry, 12(4), 615. https://doi.org/10.3390/SYM12040615
