Renormalization-group flow in neural-network priors
Invited
Abstract
Gaussian processes are ubiquitous in nature and engineering. A case in point is the class of neural networks whose priors, in the infinite-width limit, correspond to Gaussian processes. In this talk I extend this correspondence to realistic neural networks of finite width, whose priors are non-Gaussian processes. Along the way we shall encounter recursive equations that relate the distributions of neural activities in lower layers to those in higher layers, reminiscent of renormalization-group flow.
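As a concrete picture of the kind of layer-to-layer recursion meant here, consider a minimal sketch, assuming a deep fully-connected network with pointwise activation \sigma, bias variance C_b, and weight variance C_W/n at each hidden layer of width n; this setup and notation are illustrative, not taken from the talk. In the infinite-width limit the prior over preactivations at layer \ell is a Gaussian process with kernel K^{(\ell)}, which flows from layer to layer as

    K^{(\ell+1)}(x, x') = C_b + C_W \, \mathbb{E}_{z \sim \mathcal{N}(0, K^{(\ell)})} \left[ \sigma(z(x)) \, \sigma(z(x')) \right].

At finite width n the higher connected correlators, such as the four-point cumulant, no longer vanish; they acquire corrections of order 1/n that obey analogous layer-to-layer recursions, and it is these corrections that make the finite-width prior a non-Gaussian process.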
Presenters
- Sho Yaida (Facebook)
Authors
- Sho Yaida (Facebook)