Statistical Mechanics of Deep Learning

Yasaman Bahri, Jonathan Kadmon, Jeffrey Pennington, Sam S. Schoenholz, Jascha Sohl-Dickstein, Surya Ganguli

Research output: Contribution to journal › Review article › peer-review

157 Scopus citations

Abstract

The recent striking success of deep neural networks in machine learning raises profound questions about the theoretical principles underlying their success. For example, what can such deep networks compute? How can we train them? How does information propagate through them? Why can they generalize? And how can we teach them to imagine? We review recent work in which methods of physical analysis rooted in statistical mechanics have begun to provide conceptual insights into these questions. These insights yield connections between deep learning and diverse physical and mathematical topics, including random landscapes, spin glasses, jamming, dynamical phase transitions, chaos, Riemannian geometry, random matrix theory, free probability, and nonequilibrium statistical mechanics. Indeed, the fields of statistical mechanics and machine learning have long enjoyed a rich history of strongly coupled interactions, and recent advances at the intersection of statistical mechanics and deep learning suggest these interactions will only deepen going forward.

Original language: English
Pages (from-to): 501-528
Number of pages: 28
Journal: Annual Review of Condensed Matter Physics
Volume: 11
DOIs
State: Published - 10 Mar 2020
Externally published: Yes

Bibliographical note

Publisher Copyright:
Copyright © 2020 by Annual Reviews. All rights reserved.

Keywords

  • Chaos
  • Dynamical phase transitions
  • Interacting particle systems
  • Jamming
  • Machine learning
  • Neural networks
  • Nonequilibrium statistical mechanics
  • Random matrix theory
  • Spin glasses
