TY - JOUR
T1 - Learning temporal sequences by local synaptic changes
AU - Metzger, Y.
AU - Lehmann, D.
PY - 1990
Y1 - 1990
N2 - The authors consider fully connected neural networks in which synaptic weights are continuously slightly updated according to the activity of their pre- and post-synaptic neurons. The synaptic weights change slowly and therefore, at any time, the network behaves as if the weights were constant. Nevertheless, over long periods, the synaptic changes accumulate to produce significant changes in the behaviour of the network. The authors show that simple synaptic changes may produce networks capable of complex dynamical behaviour in the presence of noise. In particular, if such a network is trained by an external stimulus that forces it to go through a fixed sequence of states, it will, after sufficient training, be capable of going through the same sequence of states on its own initiative, without any external driving. The authors assume that their networks have only one mode of operation, i.e., there is no distinction between the training and the retrieval mode, except for the values of the external stimulus. Therefore, synaptic modifications take place also during the retrieval of temporal sequences. The modifications brought about by the spontaneous retrieval may improve the performance of the network. The network obtained after prolonged training by an external stimulus is very similar to the one described by Buhmann and Schulten (1988), when non-overlapping sparse patterns are considered. The biological plausibility of the rules governing the synaptic changes is discussed.
UR - http://www.scopus.com/inward/record.url?scp=42149197212&partnerID=8YFLogxK
U2 - 10.1088/0954-898X_1_2_004
DO - 10.1088/0954-898X_1_2_004
M3 - Article
AN - SCOPUS:42149197212
SN - 0954-898X
VL - 1
SP - 169
EP - 188
JO - Network: Computation in Neural Systems
JF - Network: Computation in Neural Systems
IS - 2
ER -