Equivariant Neural Networks for Particle Physics: PELICAN

ORAL · Invited

Abstract

We hold these truths to be self-evident: that all physics problems are created unequal, that they are endowed with their own data structures and symmetries, and that among these are tensor transformation laws, Lorentz symmetry, and permutation equivariance. Much attention has been paid to applications of common machine learning methods in physics experiments and theory; far less has been paid to the methods themselves and their viability as physics modeling tools. One of the most fundamental aspects of modeling physical phenomena is identifying the symmetries that govern them. Incorporating those symmetries into a model can reduce the risk of over-parameterization and consequently improve its robustness and predictive power. Building on previous work, I will demonstrate how careful choices in network design, producing a model both simpler and more grounded in physics than traditional approaches, can yield state-of-the-art performance despite the symmetry constraints. I will describe the Permutation-Equivariant and Lorentz-Invariant or Covariant Aggregator Network (PELICAN), which fuses classical ideas from Invariant Theory with recent work on permutation-equivariant maps. As a proof of concept, it is applied to tagging and to novel reconstruction (regression) problems. Particular attention will be paid to the remarkable explainability features of this kind of architecture, made possible only by the implementation of both permutation and full Lorentz symmetries. Notably, constituent-based regression with PELICAN results in particle-level features that can be visualized and interpreted in ways that are impossible with traditional architectures.
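
The key architectural idea can be made concrete with a small sketch. By classical invariant theory, the Lorentz invariants of a set of particle 4-momenta are generated by their pairwise Minkowski dot products, so a permutation- and Lorentz-symmetric network can operate on the N × N matrix of dot products rather than on the raw 4-vectors. The Python/NumPy sketch below is illustrative only, not the PELICAN implementation, and all names in it are hypothetical; it builds that matrix and checks that it is unchanged by a boost and that a simple aggregation is unchanged by relabeling the constituents.

    # Minimal illustrative sketch (not the PELICAN code); all names are hypothetical.
    import numpy as np

    METRIC = np.diag([1.0, -1.0, -1.0, -1.0])  # Minkowski metric, signature (+,-,-,-)

    def dot_products(p):
        """Pairwise Minkowski dot products d_ij = p_i . p_j for an (N, 4) array of 4-momenta."""
        return p @ METRIC @ p.T

    def boost_x(rapidity):
        """Lorentz boost along the x-axis."""
        ch, sh = np.cosh(rapidity), np.sinh(rapidity)
        L = np.eye(4)
        L[0, 0] = L[1, 1] = ch
        L[0, 1] = L[1, 0] = sh
        return L

    def invariant_summary(d):
        """A simple permutation-invariant aggregation of the dot-product matrix,
        standing in for the network's equivariant layers."""
        return np.array([d.trace(), d.sum(), (d ** 2).sum()])

    rng = np.random.default_rng(0)
    q = rng.normal(size=(5, 3))                          # toy event: 5 massless constituents
    p = np.column_stack([np.linalg.norm(q, axis=1), q])  # E = |q|, so each p_i . p_i = 0

    d = dot_products(p)
    print(np.allclose(d, dot_products(p @ boost_x(0.7).T)))          # unchanged by a boost
    print(np.allclose(invariant_summary(d),
                      invariant_summary(dot_products(p[::-1]))))     # unchanged by permutation

For the covariant (regression) mode mentioned in the abstract, one natural construction in this invariant-theory spirit is to predict invariant coefficients that multiply the input 4-momenta, so that the output automatically transforms as a 4-vector; the details of how PELICAN realizes this are given in the talk and the accompanying papers.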

Presenters

  • Alexander Bogatskiy

    Flatiron Institute

Authors

  • Alexander Bogatskiy

    Flatiron Institute