A theory of weight distribution-constrained learning

ORAL

Abstract

Recent large-scale connectomics studies have provided precise insights into the excitatory/inhibitory identities of individual synapses, as well as the distribution of synaptic weights in the brain. Motivated by this, we developed a theory of learning in neural networks that incorporates both sign and distribution constraints. We found analytical solutions for both the capacity and the generalization performance of the perceptron, a basic feedforward model, and developed an SGD-based algorithm to find weights that satisfy these constraints. We further applied our results to the Hopfield network, a recurrent model, and demonstrated that heterogeneity in neural populations emerges from a global distribution constraint.
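To make the constrained-learning setup concrete, the following is a minimal sketch (not the authors' published algorithm) of projected SGD for a perceptron with fixed per-weight sign constraints and a global weight-distribution constraint. The distribution constraint is imposed here by quantile matching to a target lognormal magnitude distribution; the sign pattern, target distribution, and hyperparameters are illustrative assumptions.

    import numpy as np

    rng = np.random.default_rng(0)

    N, P = 100, 150                      # input dimension, number of patterns
    X = rng.standard_normal((P, N))      # random input patterns
    y = rng.choice([-1.0, 1.0], size=P)  # random binary labels

    signs = rng.choice([-1.0, 1.0], size=N)  # fixed E/I identity of each weight
    # Assumed target: lognormal weight-magnitude quantiles (global constraint)
    target = np.sort(rng.lognormal(mean=0.0, sigma=1.0, size=N))

    w = signs * np.abs(rng.standard_normal(N))  # init consistent with signs

    def project(w):
        """Project weights onto the sign and distribution constraints."""
        w = signs * np.clip(signs * w, 0.0, None)  # zero out sign violations
        order = np.argsort(np.abs(w))              # rank weights by magnitude
        mags = np.empty(N)
        mags[order] = target                       # impose target magnitude quantiles
        return signs * mags

    lr = 0.05
    for epoch in range(200):
        for mu in rng.permutation(P):
            margin = y[mu] * (X[mu] @ w)
            if margin <= 0:                        # perceptron-style update on errors
                w = w + lr * y[mu] * X[mu]
                w = project(w)                     # re-impose constraints each step

    acc = np.mean(np.sign(X @ w) == y)
    print(f"training accuracy under constraints: {acc:.2f}")

The projection step keeps every weight's excitatory/inhibitory sign fixed while forcing the empirical magnitude distribution to match the target at every update, so learning proceeds only through reordering and error-driven corrections compatible with both constraints.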

Presenters

  • Weishun Zhong

    Massachusetts Institute of Technology

Authors

  • Weishun Zhong

    Massachusetts Institute of Technology

  • Ben Sorscher

    Stanford University

  • Daniel D Lee

    Cornell Tech

  • Haim I Sompolinsky

The Hebrew University of Jerusalem and Center for Brain Science, Harvard University