Using Neural Boltzmann Machines for designing proteins with a desired function
ORAL
Abstract
Generative models play a crucial role in machine learning by capturing the underlying structure of data, enabling tasks like data generation and prediction. Restricted Boltzmann Machines (RBMs) are a class of energy-based neural networks that model data by learning a mapping between visible and hidden units. However, traditional RBMs face limitations when dealing with complex or conditional data. To address this, Conditional Restricted Boltzmann Machines (CRBMs) incorporate external information to guide the generative process. Despite these advances, there is a need for models that can handle complex conditional relationships while still being trainable on extremely small datasets. For example, in the context of protein design, one is often interested in generating novel protein sequences conditioned on a function of interest. In this talk, we will present our progress on using Neural Boltzmann Machines (NBMs) to tackle this problem. NBMs generalize CRBMs by replacing RBM parameters with neural networks. We compare the ability of NBMs and RBMs to generate novel sequences with high predicted catalytic activity using a recently published dataset on the chorismate mutase enzyme. We conclude by discussing the potential of energy-based models to learn complex sequence-to-function mappings.
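To make the architectural idea concrete, below is a minimal, illustrative sketch of how an NBM-style conditional RBM might be parameterized: the visible biases, hidden biases, and couplings of a standard RBM energy are produced by small neural networks of a conditioning variable (for example, a scalar activity label). The class name, network widths, and binary-unit Gibbs step are assumptions for illustration and do not reflect the specific implementation presented in the talk; sequence data would typically use one-hot visible units.

```python
import torch
import torch.nn as nn

class NeuralBoltzmannMachine(nn.Module):
    """Sketch of an RBM whose parameters are functions of a condition c."""

    def __init__(self, n_visible, n_hidden, c_dim=1, width=64):
        super().__init__()
        self.n_visible, self.n_hidden = n_visible, n_hidden
        # Small networks mapping the condition c to RBM parameters
        # (the exact parameterization here is an assumption).
        self.vis_bias_net = nn.Sequential(nn.Linear(c_dim, width), nn.ReLU(),
                                          nn.Linear(width, n_visible))
        self.hid_bias_net = nn.Sequential(nn.Linear(c_dim, width), nn.ReLU(),
                                          nn.Linear(width, n_hidden))
        self.weight_net = nn.Sequential(nn.Linear(c_dim, width), nn.ReLU(),
                                        nn.Linear(width, n_visible * n_hidden))

    def energy(self, v, h, c):
        """Standard RBM energy E(v, h | c) = -a(c)·v - b(c)·h - v·W(c)·h."""
        a = self.vis_bias_net(c)
        b = self.hid_bias_net(c)
        W = self.weight_net(c).view(-1, self.n_visible, self.n_hidden)
        interaction = torch.einsum('bi,bij,bj->b', v, W, h)
        return -(v * a).sum(-1) - (h * b).sum(-1) - interaction

    @torch.no_grad()
    def gibbs_step(self, v, c):
        """One block-Gibbs sampling step, shown here for binary units."""
        a = self.vis_bias_net(c)
        b = self.hid_bias_net(c)
        W = self.weight_net(c).view(-1, self.n_visible, self.n_hidden)
        p_h = torch.sigmoid(b + torch.einsum('bi,bij->bj', v, W))
        h = torch.bernoulli(p_h)
        p_v = torch.sigmoid(a + torch.einsum('bj,bij->bi', h, W))
        return torch.bernoulli(p_v)
```

Generation conditioned on a desired function value would then amount to fixing c (e.g., a high target activity) and running Gibbs sampling over the visible units.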
Presenters
- Huan Souza, Boston University
Authors
- Huan Souza, Boston University
- Pankaj Mehta, Boston University