Wilsonian renormalization of neural network Gaussian processes

Jessica N. Howard*, Ro Jefferson, Anindita Maiti, Zohar Ringel

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Separating relevant from irrelevant information is key to any modeling process or scientific inquiry. Theoretical physics offers a powerful tool for achieving this in the form of the renormalization group (RG). Here we demonstrate a practical approach to performing Wilsonian RG in the context of Gaussian process (GP) regression. We systematically integrate out the unlearnable modes of the GP kernel, thereby obtaining an RG flow of the GP in which the data sets the IR scale. In simple cases, this results in a universal flow of the ridge parameter, which becomes input-dependent in the richer scenario in which non-Gaussianities are included. In addition to being analytically tractable, this approach goes beyond structural analogies between RG and neural networks by providing a natural connection between RG flow and learnable vs. unlearnable modes. Studying such flows may improve our understanding of feature learning in deep neural networks, and enable us to identify potential universality classes in these models.
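The abstract's central idea, that kernel eigenmodes with small eigenvalues are effectively unlearnable and can be integrated out, can be illustrated with a minimal numerical sketch. This is not the paper's method, only a hedged toy example in plain kernel/GP regression: on the training inputs, the posterior mean decouples over eigenmodes of the Gram matrix, each shrunk by a "learnability" factor λ_k/(λ_k + σ²), so dropping modes with λ_k ≪ σ² barely changes the fit. The kernel choice, data, and cutoff below are illustrative assumptions.

```python
import numpy as np

# Toy data (illustrative): noisy sine on [-1, 1]
rng = np.random.default_rng(0)
n = 50
x = np.sort(rng.uniform(-1.0, 1.0, n))
y = np.sin(3.0 * x) + 0.1 * rng.normal(size=n)

# RBF Gram matrix; sigma2 is the ridge / noise variance setting the "IR scale"
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.2 ** 2)
sigma2 = 0.1

# Eigenmodes of the Gram matrix: GP regression on the training set acts
# mode by mode, shrinking mode k by lam_k / (lam_k + sigma2)
lam, phi = np.linalg.eigh(K)
learnability = lam / (lam + sigma2)

# Full posterior mean on the training inputs
f_full = K @ np.linalg.solve(K + sigma2 * np.eye(n), y)

# "Integrate out" unlearnable modes: drop eigenvalues with lam_k << sigma2
# (cutoff 0.01 * sigma2 is an arbitrary illustrative choice)
keep = lam > 0.01 * sigma2
K_trunc = (phi[:, keep] * lam[keep]) @ phi[:, keep].T
f_trunc = K_trunc @ np.linalg.solve(K_trunc + sigma2 * np.eye(n), y)

# The truncated kernel reproduces the full fit almost exactly, because the
# dropped modes were barely learned to begin with
print(np.max(np.abs(f_full - f_trunc)))
```

In this discrete setting the truncation is benign by construction; the paper's contribution lies in carrying out this mode elimination systematically as an RG flow, where the discarded modes renormalize the ridge parameter rather than simply disappearing.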

Original language: English
Article number: 025038
Journal: Machine Learning: Science and Technology
Volume: 6
Issue number: 2
State: Published - 30 Jun 2025

Bibliographical note

Publisher Copyright:
© 2025 The Author(s). Published by IOP Publishing Ltd.

Keywords

  • feature learning in deep neural networks
  • Gaussian process regression
  • neural scaling laws
  • theory of neural networks
  • Wilsonian renormalization
