Learning molecular dynamics with simple language model built upon long short-term memory neural network
ORAL
Abstract
Recurrent neural networks (RNNs) have led to breakthroughs in natural language processing and speech recognition. Here we show that RNNs, specifically long short-term memory (LSTM) neural networks, can also capture the temporal evolution of chemical/biophysical trajectories. Our language model learns a probabilistic model of one-dimensional stochastic trajectories generated from higher-dimensional molecular dynamics. The model captures the Boltzmann statistics of the system and also reproduces kinetics across a large spectrum of timescales. We demonstrate how training the LSTM is equivalent to learning a path entropy, and that the LSTM embedding layer, instead of representing contextual meaning of characters, here exhibits a nontrivial connectivity between different metastable states in the underlying physical system. We also demonstrate our model's reliability through different benchmark systems and a single-molecule force spectroscopy trajectory for a multi-state riboswitch.
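The core idea above can be illustrated with a minimal sketch: a discretized 1-D trajectory is treated as a character sequence, each metastable state is embedded, an LSTM cell consumes the sequence, and a softmax over states gives the learned probability of the next state. This is a toy illustration with randomly initialized weights standing in for trained parameters; all dimensions and names (`n_states`, `d_emb`, `d_hid`, `next_state_probs`) are hypothetical, not from the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical sizes: 3 metastable states, embedding dim 4, hidden dim 8.
n_states, d_emb, d_hid = 3, 4, 8

# Randomly initialized parameters stand in for trained weights.
E  = rng.normal(0, 0.1, (n_states, d_emb))   # embedding layer (one row per state)
Wx = rng.normal(0, 0.1, (4 * d_hid, d_emb))  # input-to-gate weights (i, f, o, g stacked)
Wh = rng.normal(0, 0.1, (4 * d_hid, d_hid))  # hidden-to-gate weights
b  = np.zeros(4 * d_hid)
Wy = rng.normal(0, 0.1, (n_states, d_hid))   # hidden-to-logits readout
by = np.zeros(n_states)

def lstm_step(x, h, c):
    """One LSTM cell update: compute gates, then new cell and hidden state."""
    z = Wx @ x + Wh @ h + b
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
    g = np.tanh(g)
    c = f * c + i * g          # forget old memory, write new candidate
    h = o * np.tanh(c)         # expose a gated view of the cell state
    return h, c

def next_state_probs(trajectory):
    """P(s_{t+1} | s_1..s_t) for a discretized 1-D trajectory of state labels."""
    h, c = np.zeros(d_hid), np.zeros(d_hid)
    for s in trajectory:
        h, c = lstm_step(E[s], h, c)
    logits = Wy @ h + by
    p = np.exp(logits - logits.max())  # numerically stable softmax
    return p / p.sum()

# A toy discretized trajectory hopping between three states.
probs = next_state_probs([0, 0, 1, 1, 0, 2, 2, 1])
print(probs)  # a valid probability distribution over the 3 states
```

Training such a model by maximum likelihood on long trajectories is what the abstract equates with learning a path entropy; here only the forward (probabilistic) pass is shown.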
Presenters
-
SUN-TING TSAI
University of Maryland, College Park
Authors
-
SUN-TING TSAI
University of Maryland, College Park
-
Pratyush Tiwary
University of Maryland, College Park