Probabilistic Symmetry, Variable Exchangeability, and Deep Network Learning Invariance and Equivariance
ORAL
Abstract
This talk will describe a mathematical-statistics framework for representing, modeling, and utilizing invariance and equivariance properties of deep neural networks. By drawing direct parallels between topological characterizations of invariance and equivariance principles, probabilistic symmetry, and statistical inference, we explore the foundational properties underpinning reliability in deep learning models. We examine group-theoretic invariance in a number of deep neural network architectures, including multilayer perceptrons, convolutional networks, transformers, variational autoencoders, and steerable neural networks. Biomedical and imaging applications are discussed at the end. Understanding the theoretical foundations underpinning deep neural network invariance is critical for reliable estimation of prior-predictive distributions, accurate calculation of posterior inferences, and consistent AI prediction, classification, and forecasting.
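As a concrete illustration of the two symmetry notions the abstract centers on, the minimal NumPy sketch below (not from the talk; all function names and shapes are illustrative assumptions) numerically verifies permutation invariance of a sum-pooled perceptron layer, the deep-learning analogue of variable exchangeability, and translation equivariance of a circular convolution.

```python
# Minimal sketch (illustrative only) of invariance and equivariance checks.
import numpy as np

rng = np.random.default_rng(0)

def sum_pool_mlp(x, W):
    # f(x) = sum_i relu(W x_i): symmetric in the rows of x,
    # hence invariant to any permutation (exchange) of the inputs.
    return np.maximum(W @ x.T, 0).sum(axis=1)

x = rng.normal(size=(5, 3))   # 5 exchangeable inputs in R^3
W = rng.normal(size=(4, 3))   # shared weight matrix
perm = rng.permutation(5)
print(np.allclose(sum_pool_mlp(x, W), sum_pool_mlp(x[perm], W)))  # True

def conv1d(signal, kernel):
    # Circular convolution: commutes with cyclic shifts of the signal.
    n = len(signal)
    return np.array([sum(kernel[k] * signal[(i + k) % n]
                         for k in range(len(kernel))) for i in range(n)])

s = rng.normal(size=8)
k = rng.normal(size=3)
shift = 2
lhs = conv1d(np.roll(s, shift), k)  # shift, then convolve
rhs = np.roll(conv1d(s, k), shift)  # convolve, then shift
print(np.allclose(lhs, rhs))        # True: translation equivariance
```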
Presenters
- Yueyang Shen, University of Michigan

Authors
- Ivo D Dinov, University of Michigan
- Yueyang Shen, University of Michigan
- Yupeng Zhang, University of Wisconsin