Using Machine Learning to Discover Theories of Everything
ORAL
Abstract
The great difficulty with a "theory of everything" is that it needs to model complex, non-linear relationships between variables. I will present the nuts and bolts of a machine learning framework that uses similarity kernels to transform the non-linear problem into a tractable, linear one. Because the method relies crucially on the mathematical representation of the data, we investigate it through a concrete example: predicting the properties of all possible materials. The key idea is to construct a continuous, smooth, differentiable representation with appropriate invariances under the available symmetries.
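To make the central idea concrete, the sketch below shows kernel ridge regression with a Gaussian similarity kernel: the non-linear fitting problem becomes a single linear solve for the kernel weights. This is an illustrative sketch only, not the authors' framework; the Gaussian kernel, the NumPy implementation, and the random stand-in "representation vectors" are assumptions made for the example.

    # Minimal sketch (assumed example, not the presented code): kernel ridge
    # regression with a Gaussian similarity kernel. Each row of X stands in
    # for a smooth, symmetry-invariant representation of a material, and y
    # is the target property.
    import numpy as np

    def gaussian_kernel(A, B, sigma=1.0):
        """Similarity matrix K[i, j] = exp(-||A_i - B_j||^2 / (2 sigma^2))."""
        d2 = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        return np.exp(-d2 / (2.0 * sigma**2))

    def fit(X, y, sigma=1.0, lam=1e-6):
        """Non-linear problem -> linear solve: (K + lam*I) alpha = y."""
        K = gaussian_kernel(X, X, sigma)
        return np.linalg.solve(K + lam * np.eye(len(X)), y)

    def predict(X_new, X_train, alpha, sigma=1.0):
        """Predicted property: similarity to each training material, weighted by alpha."""
        return gaussian_kernel(X_new, X_train, sigma) @ alpha

    # Illustrative use with random stand-in representation vectors.
    rng = np.random.default_rng(0)
    X_train = rng.normal(size=(50, 8))      # 50 "materials", 8-dim representation
    y_train = np.sin(X_train).sum(axis=1)   # hypothetical non-linear property
    alpha = fit(X_train, y_train, sigma=2.0)
    print(predict(X_train[:3], X_train, alpha, sigma=2.0), y_train[:3])

Once the similarity kernel is fixed, all of the non-linearity lives in the kernel; the learning step itself is linear algebra, which is what makes the approach tractable.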
–
Authors
-
Conrad W. Rosenbrock
Brigham Young University
-
Gus L. W. Hart
Brigham Young University