On the initialization of long short-term memory networks

Abstract

Weight initialization is important for fast convergence and stable training of deep neural networks. In this paper, a robust initialization method is developed to address training instability in long short-term memory (LSTM) networks. It is based on a normalized random initialization of the network weights that aims to keep the variance of the network input and output in the same range. The method is applied to standard LSTMs for univariate time series regression and to LSTMs robust to missing values for multivariate disease progression modeling. The results show that in all cases, the proposed initialization method outperforms state-of-the-art initialization techniques in terms of training convergence and the generalization performance of the obtained solution.
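
As a rough illustration of the variance-preserving idea described in the abstract, the sketch below initializes the four LSTM gate matrices with a fan-based, Glorot-style uniform scaling. This is a minimal sketch under that assumption: the paper's exact normalization is not reproduced here, and the function names (variance_scaled_uniform, init_lstm_weights) are illustrative, not the authors' implementation.

import numpy as np

def variance_scaled_uniform(fan_in, fan_out, rng):
    # Zero-mean uniform weights whose variance is normalized by the
    # layer's fan-in and fan-out (Glorot-style scaling; the paper's
    # exact normalization may differ).
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_out, fan_in))

def init_lstm_weights(input_size, hidden_size, seed=0):
    # One variance-scaled block per gate (input, forget, cell, output),
    # stacked row-wise as most LSTM implementations expect.
    rng = np.random.default_rng(seed)
    W_x = np.vstack([variance_scaled_uniform(input_size, hidden_size, rng)
                     for _ in range(4)])   # input-to-hidden, shape (4h, d)
    W_h = np.vstack([variance_scaled_uniform(hidden_size, hidden_size, rng)
                     for _ in range(4)])   # hidden-to-hidden, shape (4h, h)
    b = np.zeros(4 * hidden_size)          # zero gate biases
    return W_x, W_h, b

The point of the sketch is only that each gate's input and recurrent weights are drawn so that activation variance is roughly preserved across both paths; the resulting matrices could be copied into a framework's LSTM parameters in the usual stacked-gate layout.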

Citation (APA)

Mehdipour Ghazi, M., Nielsen, M., Pai, A., Modat, M., Cardoso, M. J., Ourselin, S., & Sørensen, L. (2019). On the initialization of long short-term memory networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11953 LNCS, pp. 275–286). Springer. https://doi.org/10.1007/978-3-030-36708-4_23
