Building Deep Learning Architectures for Physics, Chemistry, and Biology with Geometric Algebra
ORAL
Abstract
Designing deep learning architectures that respect the structure underlying problems of interest can both improve the data efficiency of model training and impose symmetries that are vital for many applications. For physical problems in real space, we would often like to formulate machine learning models which incorporate not only geometric information, but also point-level signals such as the identity of coarse-grained beads, atoms, or protein residues. In this talk, we build permutation- and rotation-equivariant neural network layers for learning on point clouds using geometric algebra and attention mechanisms. We demonstrate the flexibility of these architectures by solving example problems spanning domains from basic physics to biology.
Publication: https://arxiv.org/abs/2110.02393
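As a rough illustration of the ingredients named in the abstract, the sketch below combines rotation-invariant quantities derived from the geometric product of two 3D vectors (the scalar grade and the bivector magnitude) with a softmax attention step over point pairs. This is a minimal NumPy sketch under assumed names (`geometric_product_invariants`, `pairwise_attention`, `value_fn`, `score_fn`); it is not the implementation accompanying the linked paper.

```python
# Hypothetical sketch: rotation-invariant geometric-product features feeding
# a permutation-equivariant attention step over point pairs. Function and
# variable names here are illustrative, not taken from the paper's code.
import numpy as np

def geometric_product_invariants(r_i, r_j):
    """Rotation-invariant parts of the geometric product of two 3D vectors:
    the scalar (dot) grade and the bivector (wedge) magnitude."""
    dot = np.einsum("...d,...d->...", r_i, r_j)       # scalar grade
    wedge = np.cross(r_i, r_j)                        # bivector grade
    return np.stack([dot, np.linalg.norm(wedge, axis=-1)], axis=-1)

def pairwise_attention(points, value_fn, score_fn):
    """Attention over all ordered pairs of points; permutation-equivariant
    because every point is treated identically. `value_fn` maps the
    (..., 2) invariants to per-pair value vectors and `score_fn` maps them
    to unnormalized scalar scores (both stand in for learned MLPs)."""
    n = points.shape[0]
    r_i = np.broadcast_to(points[:, None, :], (n, n, 3))
    r_j = np.broadcast_to(points[None, :, :], (n, n, 3))
    inv = geometric_product_invariants(r_i, r_j)      # (n, n, 2)
    scores = score_fn(inv)                            # (n, n)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)    # softmax over neighbors j
    values = value_fn(inv)                            # (n, n, d)
    return np.einsum("ij,ijd->id", weights, values)   # one feature vector per point

# Toy usage with random linear maps standing in for trained networks.
rng = np.random.default_rng(0)
points = rng.normal(size=(5, 3))
W_v, w_s = rng.normal(size=(2, 8)), rng.normal(size=(2,))
features = pairwise_attention(points, lambda x: x @ W_v, lambda x: x @ w_s)
print(features.shape)  # (5, 8)
```

Because the attention weights depend only on rotation-invariant inputs and the pooling over neighbors is symmetric, the per-point outputs are invariant to rigid rotations and equivariant to permutations of the input points, which is the property the talk's layers are built around.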
Presenters
- Matthew P Spellings, Vector Institute for Artificial Intelligence
Authors
- Matthew P Spellings, Vector Institute for Artificial Intelligence