Learning molecular dynamics with simple language model built upon long short-term memory neural network

85 citations · 205 Mendeley readers

This article is free to access.

Abstract

Recurrent neural networks have led to breakthroughs in natural language processing and speech recognition. Here we show that recurrent networks, specifically long short-term memory networks, can also capture the temporal evolution of chemical/biophysical trajectories. Our character-level language model learns a probabilistic model of 1-dimensional stochastic trajectories generated from higher-dimensional dynamics. The model captures Boltzmann statistics and also reproduces kinetics across a spectrum of timescales. We demonstrate that training the long short-term memory network is equivalent to learning a path entropy, and that its embedding layer, instead of representing contextual meaning of characters, here exhibits a nontrivial connectivity between different metastable states in the underlying physical system. We demonstrate our model’s reliability through different benchmark systems and a force spectroscopy trajectory for a multi-state riboswitch. We anticipate that our work represents a stepping stone toward the use of recurrent neural networks for understanding the dynamics of complex stochastic molecular systems.
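A character-level language model presupposes a discrete alphabet, so the continuous 1-dimensional trajectory must first be mapped onto characters. The sketch below shows one plausible preprocessing step under that assumption: uniform binning of trajectory values into letters. The bin count, bounds, and toy trajectory are illustrative, not taken from the paper.

```python
# Hedged sketch: turn a continuous 1-D trajectory into a character string
# via uniform binning, so it can feed a character-level language model.
# Parameters (n_bins, lo, hi) and the toy data are assumptions for illustration.

def discretize(trajectory, n_bins, lo, hi):
    """Map each value to a character 'a', 'b', ... by uniform binning over [lo, hi)."""
    width = (hi - lo) / n_bins
    chars = []
    for x in trajectory:
        idx = int((x - lo) / width)          # which bin the value falls in
        idx = max(0, min(idx, n_bins - 1))   # clamp out-of-range values to edge bins
        chars.append(chr(ord('a') + idx))
    return ''.join(chars)

traj = [0.1, 0.35, 0.9, 0.55, 0.2]           # toy 1-D trajectory
seq = discretize(traj, n_bins=4, lo=0.0, hi=1.0)
print(seq)  # → abdca
```

In this representation, each metastable state of the physical system would correspond to a recurring character (or set of characters), and transitions between states appear as character changes that the LSTM can learn to predict.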

Citation (APA)

Tsai, S. T., Kuo, E. J., & Tiwary, P. (2020). Learning molecular dynamics with simple language model built upon long short-term memory neural network. Nature Communications, 11(1). https://doi.org/10.1038/s41467-020-18959-8
