OGRU: An Optimized Gated Recurrent Unit Neural Network

Abstract

Because of structural limitations, traditional neural network models are prone to problems such as gradient explosion and over-fitting, while deep GRU models suffer from low update efficiency and poor information processing across multiple hidden layers. To address this, this paper proposes an optimized gated recurrent unit (OGRU) neural network. The OGRU model improves information processing capability and learning efficiency by optimizing the unit structure and learning mechanism of the GRU, preventing the update gate from being interfered with by the current forgetting information. The experiments use the TensorFlow framework to build prediction models with LSTM, GRU, and OGRU networks and compare their prediction accuracy. The results show that the OGRU model has the highest learning efficiency and better prediction accuracy.
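
The abstract does not spell out the modified cell equations, but its description suggests that OGRU routes the reset-gated state into the update gate, so that information the reset gate is discarding cannot interfere with the update decision. The NumPy sketch below contrasts a standard GRU step with that assumed OGRU variant; the function names, parameter layout, and the exact gating change are illustrative assumptions based on the abstract's wording, not the paper's verbatim formulation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, p):
    # Standard GRU: the update gate z sees the raw previous state h_prev.
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h_prev + p["bz"])            # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h_prev + p["br"])            # reset gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_prev) + p["bh"])  # candidate state
    return (1.0 - z) * h_prev + z * h_tilde

def ogru_step(x, h_prev, p):
    # Assumed OGRU variant: the update gate sees the reset-filtered state
    # r * h_prev instead of h_prev, so currently-forgotten information
    # cannot interfere with the update gate.
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h_prev + p["br"])            # reset gate first
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ (r * h_prev) + p["bz"])      # shielded update gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h_prev) + p["bh"])
    return (1.0 - z) * h_prev + z * h_tilde

# Tiny smoke test with random parameters (shapes: W* are hidden-by-input,
# U* are hidden-by-hidden, b* are hidden-sized biases).
rng = np.random.default_rng(0)
n_in, n_hid = 4, 8
p = {k: rng.normal(scale=0.1, size=(n_hid, n_in if k[0] == "W" else n_hid))
     for k in ("Wz", "Uz", "Wr", "Ur", "Wh", "Uh")}
p.update({k: np.zeros(n_hid) for k in ("bz", "br", "bh")})
x, h = rng.normal(size=n_in), np.zeros(n_hid)
print(gru_step(x, h, p))   # standard GRU output
print(ogru_step(x, h, p))  # OGRU-style output

Both steps keep the GRU's interpolation h_t = (1 - z) * h_prev + z * h_tilde; the only difference is which view of the previous state feeds the update gate, which is the structural optimization the abstract describes.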

Citation (APA)

Wang, X., Xu, J., Shi, W., & Liu, J. (2019). OGRU: An Optimized Gated Recurrent Unit Neural Network. Journal of Physics: Conference Series, 1325, 012089. https://doi.org/10.1088/1742-6596/1325/1/012089
