Analytic characterization of the Hessian in shallow ReLU models: A tale of symmetry

Yossi Arjevani, Michael Field

Research output: Contribution to journal › Conference article › peer-review



We consider the optimization problem associated with fitting two-layer ReLU networks with respect to the squared loss, where labels are generated by a target network. We leverage the rich symmetry structure to analytically characterize the Hessian at various families of spurious minima in the natural regime where the number of inputs d and the number of hidden neurons k are finite. In particular, we prove that for d = k standard Gaussian inputs: (a) of the dk eigenvalues of the Hessian, dk - O(d) concentrate near zero, (b) Ω(d) of the eigenvalues grow linearly with k. Although this phenomenon of an extremely skewed spectrum has been observed many times before, to our knowledge, this is the first time it has been established rigorously. Our analytic approach uses techniques, new to the field, from symmetry breaking and representation theory, and carries important implications for our ability to argue about statistical generalization through local curvature.
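The skewed-spectrum claim can be probed numerically. The sketch below is illustrative only and is not the paper's analytic method: it builds a small two-layer ReLU student/target pair with d = k (target weights set to the identity, a hypothetical choice), approximates the Gaussian-input squared loss by Monte Carlo, forms the dk × dk Hessian by symmetric finite differences, and inspects the eigenvalue distribution. Since the ReLU loss is not twice differentiable at kinks, the finite-difference Hessian is only an approximation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = k = 4                 # inputs = hidden neurons, the paper's d = k regime
n = 5000                  # Monte Carlo samples approximating the Gaussian expectation

X = rng.standard_normal((n, d))
V = np.eye(k)             # hypothetical target weights (identity)

def net(W, X):
    # Two-layer ReLU network: sum of ReLU units, second layer fixed to ones.
    return np.maximum(X @ W.T, 0.0).sum(axis=1)

y = net(V, X)

def loss(w_flat):
    # Squared loss of the student against target labels y.
    W = w_flat.reshape(k, d)
    r = net(W, X) - y
    return 0.5 * np.mean(r * r)

def hessian_fd(f, w, eps=1e-4):
    # Symmetric finite-difference approximation of the dk x dk Hessian.
    p = w.size
    H = np.zeros((p, p))
    for i in range(p):
        for j in range(p):
            ei = np.zeros(p); ei[i] = eps
            ej = np.zeros(p); ej[j] = eps
            H[i, j] = (f(w + ei + ej) - f(w + ei) - f(w + ej) + f(w)) / eps**2
    return 0.5 * (H + H.T)

# Evaluate at a perturbed point near the target (not one of the paper's
# spurious minima, which require the symmetry analysis to locate).
W0 = V.flatten() + 0.5 * rng.standard_normal(d * k)
H = hessian_fd(loss, W0)
eigs = np.sort(np.linalg.eigvalsh(H))
print("eigenvalues within 0.1 of zero:", int(np.sum(np.abs(eigs) < 0.1)), "of", d * k)
print("largest eigenvalue:", eigs[-1])
```

Even at generic points one typically sees a bulk of small eigenvalues and a few large outliers; the paper's contribution is to establish this skew rigorously at families of spurious minima.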

Original language: American English
Journal: Advances in Neural Information Processing Systems
State: Published - 2020
Externally published: Yes
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 - 12 Dec 2020

Bibliographical note

Publisher Copyright:
© 2020 Neural information processing systems foundation. All rights reserved.

