Abstract
Deep neural networks (DNNs) in the infinite width/channel limit have received much attention recently, as they provide a clear analytical window into deep learning via mappings to Gaussian Processes (GPs). Despite its theoretical appeal, this viewpoint lacks a crucial ingredient of deep learning present in finite DNNs and lying at the heart of their success: feature learning. Here we consider DNNs trained with noisy gradient descent on a large training set and derive a self-consistent Gaussian Process theory accounting for strong finite-DNN and feature-learning effects. Applying this to a toy model of a two-layer linear convolutional neural network (CNN) shows good agreement with experiments. We further identify, both analytically and numerically, a sharp transition between a feature-learning regime and a lazy-learning regime in this model. Strong finite-DNN effects are also derived for a non-linear two-layer fully connected network. We provide numerical evidence that the assumptions required by our theory hold in more realistic settings (a Myrtle5 CNN trained on CIFAR-10). Our self-consistent theory thus offers a rich and versatile analytical framework for studying strong finite-DNN effects, most notably feature learning.
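To make the training procedure named in the abstract concrete, below is a minimal sketch of noisy (Langevin-style) gradient descent on a two-layer linear CNN toy model: each update takes a gradient step on the MSE loss plus weight decay and adds Gaussian noise of matching scale, so the weights sample a Gibbs posterior at temperature T. This is a hypothetical illustration, not the paper's code; the patch structure, shapes, hyperparameters, and the linear teacher are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup (illustrative, not the paper's exact experiment):
# two-layer linear CNN on 1D inputs split into P non-overlapping
# patches of size K, with C channels:
#   f(x) = sum_{c,p} a[c, p] * (w[c] . x_patch[p])
N, K, P, C = 256, 8, 4, 32           # samples, kernel size, patches, channels
D = K * P                            # input dimension

X = rng.standard_normal((N, D))
w_star = rng.standard_normal(D) / np.sqrt(D)
y = X @ w_star                       # targets from an assumed linear teacher
Xp = X.reshape(N, P, K)              # inputs split into patches

w = rng.standard_normal((C, K)) / np.sqrt(K)      # conv filters
a = rng.standard_normal((C, P)) / np.sqrt(C * P)  # readout weights

lr, T, steps = 1e-3, 1e-4, 10000     # step size, noise temperature, iterations
for t in range(steps):
    h = np.einsum('npk,ck->ncp', Xp, w)   # hidden pre-activations (N, C, P)
    f = np.einsum('ncp,cp->n', h, a)      # network outputs (N,)
    err = f - y                           # residuals
    # Gradients of (1/2N) * sum err^2 plus weight decay T * ||theta||^2 / 2;
    # the decay term acts as a Gaussian prior in the stationary distribution.
    ga = np.einsum('n,ncp->cp', err, h) / N + T * a
    gw = np.einsum('n,cp,npk->ck', err, a, Xp) / N + T * w
    # Langevin update: gradient step plus Gaussian noise of scale sqrt(2*lr*T)
    a -= lr * ga + np.sqrt(2 * lr * T) * rng.standard_normal(a.shape)
    w -= lr * gw + np.sqrt(2 * lr * T) * rng.standard_normal(w.shape)

h = np.einsum('npk,ck->ncp', Xp, w)
f = np.einsum('ncp,cp->n', h, a)
print(f"final train MSE: {np.mean((f - y) ** 2):.6f}")
```

With the noise and weight-decay terms set to zero this reduces to plain full-batch gradient descent; keeping them makes the long-time weight distribution a posterior rather than a point estimate, which is the regime in which a GP-style self-consistent analysis applies.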
| Original language | English |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 34 - 35th Conference on Neural Information Processing Systems, NeurIPS 2021 |
| Editors | Marc'Aurelio Ranzato, Alina Beygelzimer, Yann Dauphin, Percy S. Liang, Jenn Wortman Vaughan |
| Publisher | Neural information processing systems foundation |
| Pages | 21352-21364 |
| Number of pages | 13 |
| ISBN (Electronic) | 9781713845393 |
| State | Published - 2021 |
| Event | 35th Conference on Neural Information Processing Systems, NeurIPS 2021 - Virtual, Online |
| Duration | 6 Dec 2021 → 14 Dec 2021 |
Publication series
| Name | Advances in Neural Information Processing Systems |
|---|---|
| Volume | 34 |
| ISSN (Print) | 1049-5258 |
Conference
| Conference | 35th Conference on Neural Information Processing Systems, NeurIPS 2021 |
|---|---|
| City | Virtual, Online |
| Period | 6/12/21 → 14/12/21 |
Bibliographical note
Publisher Copyright: © 2021 Neural information processing systems foundation. All rights reserved.