Simplified derivations for high-dimensional convex learning problems

Abstract
Statistical-physics calculations in machine learning and theoretical neuroscience often involve lengthy derivations that obscure physical interpretation. Here, we give concise, non-replica derivations of several key results and highlight their underlying similarities. In particular, using a cavity approach, we analyze three high-dimensional learning problems: perceptron classification of points, perceptron classification of manifolds, and kernel ridge regression. These problems share a common structure—a bipartite system of interacting feature and datum variables—enabling a unified analysis. Furthermore, for perceptron-capacity problems, we identify a symmetry that allows derivation of correct capacities through a naïve method.
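As an illustrative aside (not part of the paper's cavity derivation), the point-perceptron storage capacity α_c = 2 that such analyses recover can be sanity-checked against Cover's classical function-counting formula, which gives the exact fraction of random dichotomies of P points in general position in R^N that a zero-threshold perceptron can realize. A minimal sketch:

```python
from math import comb

def separable_fraction(P: int, N: int) -> float:
    """Cover (1965): fraction of the 2^P dichotomies of P points in
    general position in R^N realizable by a zero-threshold perceptron,
    C(P, N) / 2^P = 2^(1-P) * sum_{k=0}^{N-1} binom(P-1, k)."""
    if P <= N:
        return 1.0  # P points in general position are always separable
    return sum(comb(P - 1, k) for k in range(N)) / 2 ** (P - 1)

# At P = 2N the realizable fraction is exactly 1/2 for every N, and the
# transition sharpens as N grows, giving the capacity alpha_c = P/N = 2.
print(separable_fraction(200, 100))  # 0.5
print(separable_fraction(100, 100))  # 1.0 (alpha = 1: below capacity)
print(separable_fraction(400, 100))  # near 0 (alpha = 4: above capacity)
```

This counting argument is independent of the replica and cavity methods discussed in the paper, but it pins down the same critical load for the point-classification problem.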
| Original language | English |
|---|---|
| Journal | SciPost Physics Lecture Notes |
| Volume | 105 |
| DOIs | |
| State | Published - 2025 |
Bibliographical note
Publisher Copyright: Copyright D. G. Clark and H. Sompolinsky.