Towards Systematically Improvable Deep Learning Interatomic Potentials with E(3)-Equivariant Cluster Expansions
ORAL
Abstract
Message Passing Neural Networks (MPNNs) have emerged as the leading paradigm for modeling molecules and materials. While MPNNs have consistently shown remarkably low generalization errors, they inherently lack interpretability, are not systematically improvable, and are difficult to parallelize. In this talk, we discuss the Deep Interatomic Cluster Expansion (DICE), an equivariant neural network that learns many-body information without message passing or convolutions. DICE can be systematically improved through the inclusion of higher-order interactions, comes with physically meaningful hyperparameters, and is embarrassingly parallel. The method builds on a novel, learnable E(3)-equivariant many-body representation that uses weighted tensor products of geometric features to describe N-point interactions among atoms. The proposed many-body representation overcomes the combinatorial scaling of a complete cluster expansion and instead scales linearly with the number of simultaneously correlated atoms. We find that including higher-order correlations of atoms systematically improves the accuracy of the learned potential. Finally, we examine transferability to out-of-distribution data, investigate the learned energy decompositions, and discuss theoretical connections to existing work.
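For readers new to cluster-expansion descriptors, the following is a minimal sketch, in the spirit of the atomic cluster expansion (ACE), of how weighted tensor products can yield many-body features at linear cost; the symbols $A_i$, $B_i^{(\nu)}$, $w_{ij}$, $R$, and $Y$ are illustrative assumptions, not the talk's own notation.

$$
A_i = \sum_{j \in \mathcal{N}(i)} w_{ij}\, R(r_{ij})\, Y(\hat{r}_{ij}),
\qquad
B_i^{(\nu)} = \underbrace{A_i \otimes \cdots \otimes A_i}_{\nu\ \text{factors}}
$$

Here $Y$ are spherical harmonics, $R$ is a learnable radial basis, $w_{ij}$ are learnable per-neighbor weights, and each tensor product is contracted back to a fixed set of E(3) irreducible representations. Because the neighbor sum is performed once and the order-$\nu$ feature, which correlates $\nu + 1$ atoms ($\nu$ neighbors plus the central atom), is built by $\nu - 1$ successive products, the cost grows linearly in $\nu$ rather than combinatorially in the number of atom tuples.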
Presenters
- Albert Musaelian (Harvard University)

Authors
- Albert Musaelian (Harvard University)
- Simon L. Batzner (Harvard University)
- Boris Kozinsky (Harvard University)