Integration of Neural Network-Based Symbolic Regression in Deep Learning for Scientific Discovery
ORAL
Abstract
Symbolic regression is a powerful technique that can discover the underlying analytical equations describing data, which can lead to explainable models and generalizability beyond the training data set. Here we use a neural network for symbolic regression based on the EQL network and integrate it into other deep learning architectures such that the whole system can be trained end-to-end through backpropagation. We demonstrate this system on an arithmetic task involving MNIST digits and on prediction of dynamical systems. The architecture is able to simultaneously extract meaningful latent variables and find the underlying equations, which generalize extremely well outside of the training data set compared to standard neural network approaches, paving the way for scientific discovery.
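The sketch below illustrates the general idea described in the abstract, not the authors' exact implementation: an EQL-style layer applies a linear transform followed by a fixed bank of primitive functions (here identity, sine, cosine, and pairwise products are assumed as the primitive set), and the network is placed downstream of a conventional encoder so that both are trained end-to-end by backpropagation with a sparsity penalty. The primitive choices, layer widths, and penalty weight are illustrative assumptions.

```python
import torch
import torch.nn as nn


class EQLLayer(nn.Module):
    """One EQL-style layer: a linear map followed by a fixed bank of
    primitive functions, so a sparse weight matrix reads out as an
    analytic expression. Primitive set here is an illustrative choice."""

    def __init__(self, in_dim, n_units=4):
        super().__init__()
        self.n_units = n_units
        # 3 unary primitives (identity, sin, cos) plus 2 inputs for one product per unit
        self.linear = nn.Linear(in_dim, 5 * n_units)

    def forward(self, x):
        z = self.linear(x).view(-1, self.n_units, 5)
        ident = z[..., 0]
        sin = torch.sin(z[..., 1])
        cos = torch.cos(z[..., 2])
        prod = z[..., 3] * z[..., 4]
        return torch.cat([ident, sin, cos, prod], dim=-1)


class EQLNetwork(nn.Module):
    """Stack of EQL-style layers with a linear read-out; an L1 penalty on the
    weights encourages a sparse, interpretable equation."""

    def __init__(self, in_dim, n_layers=2, n_units=4):
        super().__init__()
        layers, dim = [], in_dim
        for _ in range(n_layers):
            layers.append(EQLLayer(dim, n_units))
            dim = 4 * n_units  # output width of each EQL-style layer
        self.hidden = nn.Sequential(*layers)
        self.readout = nn.Linear(dim, 1)

    def forward(self, x):
        return self.readout(self.hidden(x))

    def l1_penalty(self):
        return sum(p.abs().sum() for p in self.parameters())


# End-to-end training sketch: an upstream encoder (hypothetical shapes) maps raw
# observations to latent variables, and the EQL network fits an expression in them.
encoder = nn.Sequential(nn.Linear(10, 16), nn.ReLU(), nn.Linear(16, 2))
eql = EQLNetwork(in_dim=2)
x, y = torch.randn(32, 10), torch.randn(32, 1)
pred = eql(encoder(x))
loss = nn.functional.mse_loss(pred, y) + 1e-3 * eql.l1_penalty()
loss.backward()  # gradients flow through both the EQL network and the encoder
```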
Presenters
-
Samuel Kim
Electrical Engineering and Computer Science, Massachusetts Institute of Technology
Authors
-
Samuel Kim
Electrical Engineering and Computer Science, Massachusetts Institute of Technology
-
Peter Lu
Department of Physics, Massachusetts Institute of Technology
-
Michael Gilbert
Electrical Engineering and Computer Science, Massachusetts Institute of Technology
-
Srijon Mukherjee
Physics, Massachusetts Institute of Technology
-
Li Jing
Physics, Massachusetts Institute of Technology
-
Vladimir Čeperić
University of Zagreb
-
Marin Soljacic
Department of Physics, Massachusetts Institute of Technology