Title: Σ-attention: A Machine-Learning Framework for Learning the Self-Energy Operator in Strongly Correlated Systems

ORAL

Abstract

We introduce Σ-attention, a machine-learning (ML) framework specifically designed to learn the self-energy operator in strongly correlated electronic systems. This operator-learning framework leverages a transformer-based architecture, commonly used in large language models, to establish a mapping between the noninteracting Green's function and the self-energy. Our results show that the transformer's attention mechanism effectively learns this mapping from training data of electronic systems with various sizes and interaction strengths. Consequently, we obtain a neural-network representation of the self-energy that is universally valid across a broad range of interaction strengths. We show that the learned self-energy operator effectively captures the Mott transition and yields quantitatively better predictions of electronic properties than perturbative methods such as GW.

Presenters

  • Yuanran Zhu

    Lawrence Berkeley National Laboratory

Authors

  • Yuanran Zhu

    Lawrence Berkeley National Laboratory