Learning curves for overparametrized deep neural networks: A field theory perspective

Omry Cohen, Or Malka, Zohar Ringel

Research output: Contribution to journal › Article › peer-review

20 Scopus citations

Abstract

In the past decade, deep neural networks (DNNs) have come to the fore as the leading machine-learning algorithms for a variety of tasks. Their rise was founded on market needs and engineering craftsmanship, the latter based more on trial and error than on theory. While still far behind the application forefront, the theoretical study of DNNs has recently made important advances in analyzing the highly overparametrized regime, where some exact results have been obtained. Leveraging these ideas and adopting a more physics-like approach, here we construct a versatile field-theory formalism for supervised deep learning, involving renormalization group, Feynman diagrams, and replicas. In particular, we show that our approach leads to highly accurate predictions of learning curves of truly deep DNNs trained on polynomial regression problems. It also explains in a concrete manner why DNNs generalize well despite being highly overparametrized, owing to an entropic bias toward simple functions, which, for the case of fully connected DNNs with data sampled on the hypersphere, are low-order polynomials in the input vector. Since a DNN is a complex interacting system of artificial neurons, we believe that such tools and methodologies borrowed from condensed matter physics will prove essential for obtaining an accurate quantitative understanding of deep learning.

Original language: American English
Article number: 023034
Journal: Physical Review Research
Volume: 3
Issue number: 2
DOIs
State: Published - 9 Apr 2021

Bibliographical note

Publisher Copyright:
© 2021 authors. Published by the American Physical Society.
