TY - JOUR
T1 - Learning from examples in large neural networks
AU - Sompolinsky, H.
AU - Tishby, N.
AU - Seung, H. S.
PY - 1990
Y1 - 1990
AB - A statistical-mechanical theory of learning from examples in layered networks at finite temperature is studied. When the training error is a smooth function of continuously varying weights, the generalization error falls off asymptotically as the inverse number of examples. By analytical and numerical studies of single-layer perceptrons, we show that when the weights are discrete, the generalization error can exhibit a discontinuous transition to perfect generalization. For intermediate sizes of the example set, the state of perfect generalization coexists with a metastable spin-glass state.
UR - http://www.scopus.com/inward/record.url?scp=0003797476&partnerID=8YFLogxK
U2 - 10.1103/PhysRevLett.65.1683
DO - 10.1103/PhysRevLett.65.1683
M3 - Article
AN - SCOPUS:0003797476
SN - 0031-9007
VL - 65
SP - 1683
EP - 1686
JO - Physical Review Letters
JF - Physical Review Letters
IS - 13
ER -