Due to their structure, traditional neural network models are prone to problems such as gradient explosion and over-fitting, while deep GRU neural network models suffer from low update efficiency and poor information-processing capability across multiple hidden layers. To address this, this paper proposes an optimized gated recurrent unit (OGRU) neural network. The proposed OGRU model improves information-processing capability and learning efficiency by optimizing the unit structure and learning mechanism of the GRU, and prevents the update gate from being interfered with by the current forgetting information. The experiments use the TensorFlow framework to build prediction models for the LSTM, GRU, and OGRU neural networks and compare their prediction accuracy. The results show that the OGRU model achieves the highest learning efficiency and better prediction accuracy than the baselines.
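The abstract does not give OGRU's equations, but for context, the standard GRU cell it modifies combines an update gate and a reset gate to blend the previous hidden state with a candidate state. A minimal NumPy sketch of one standard GRU step (all weight names and dimensions here are illustrative, not taken from the paper):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x_t, h_prev, params):
    """One step of a standard GRU cell (not the paper's OGRU variant)."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x_t + Uz @ h_prev + bz)               # update gate
    r = sigmoid(Wr @ x_t + Ur @ h_prev + br)               # reset gate
    h_tilde = np.tanh(Wh @ x_t + Uh @ (r * h_prev) + bh)   # candidate state
    return (1 - z) * h_prev + z * h_tilde                  # blended new state

# Toy usage with hypothetical dimensions.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
params = (
    rng.standard_normal((n_hid, n_in)), rng.standard_normal((n_hid, n_hid)), np.zeros(n_hid),
    rng.standard_normal((n_hid, n_in)), rng.standard_normal((n_hid, n_hid)), np.zeros(n_hid),
    rng.standard_normal((n_hid, n_in)), rng.standard_normal((n_hid, n_hid)), np.zeros(n_hid),
)
h = gru_cell(rng.standard_normal(n_in), np.zeros(n_hid), params)
```

Note how the reset gate `r` scales the previous state inside the candidate computation; the paper's stated goal of shielding the update gate from current forgetting information targets the interaction between these gates.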
Wang, X., Xu, J., Shi, W., & Liu, J. (2019). OGRU: An Optimized Gated Recurrent Unit Neural Network. In Journal of Physics: Conference Series (Vol. 1325). Institute of Physics Publishing. https://doi.org/10.1088/1742-6596/1325/1/012089