Learning temporal sequences by local synaptic changes

Y. Metzger*, D. Lehmann

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

The authors consider fully connected neural networks in which synaptic weights are continuously updated by small amounts according to the activity of their pre- and post-synaptic neurons. Because the weights change slowly, at any given time the network behaves as if they were constant; over long periods, however, the changes accumulate and significantly alter the network's behaviour. The authors show that simple synaptic changes can produce networks capable of complex dynamical behaviour in the presence of noise. In particular, if such a network is trained by an external stimulus that forces it through a fixed sequence of states, it will, after sufficient training, be capable of going through the same sequence on its own, without any external driving. The networks are assumed to have only one mode of operation: there is no distinction between training and retrieval except for the values of the external stimulus, so synaptic modifications also take place during the retrieval of temporal sequences. The modifications brought about by spontaneous retrieval may improve the performance of the network. When non-overlapping sparse patterns are considered, the network obtained after prolonged training by an external stimulus is very similar to the one described by Buhmann and Schulten (1988). The biological plausibility of the rules governing the synaptic changes is discussed.
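
The abstract describes the mechanism only verbally; the following is a minimal illustrative sketch, not the authors' model. It assumes binary 0/1 units, non-overlapping sparse patterns arranged in a cycle, a centered (covariance-style) asymmetric Hebbian update that binds each pattern to its successor, and a k-winners-take-all retrieval step. All names and parameter values (`step`, `EPS`, `NOISE`, etc.) are hypothetical, and the sketch omits the paper's continuous-time dynamics and the learning that continues during retrieval.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for illustration; none are taken from the paper.
N, P, ACTIVE = 120, 4, 30   # neurons, patterns in the cycle, active units per pattern
EPS = 0.01                  # small learning rate: weights change slowly
NOISE = 0.05                # amplitude of noise added to each unit's input
EPOCHS = 200                # training passes through the sequence

# Non-overlapping sparse patterns: pattern k activates units k*ACTIVE .. (k+1)*ACTIVE-1.
patterns = np.zeros((P, N))
for k in range(P):
    patterns[k, k * ACTIVE:(k + 1) * ACTIVE] = 1.0

W = np.zeros((N, N))        # fully connected weights, initially unstructured

def step(state, W):
    """One noisy parallel update; the ACTIVE units with the largest input fire."""
    h = W @ state + NOISE * rng.standard_normal(N)
    new = np.zeros(N)
    new[np.argsort(h)[-ACTIVE:]] = 1.0
    return new

# Training: an external stimulus clamps the network to the sequence, and a
# centered asymmetric Hebbian rule links each pattern to its successor.
for _ in range(EPOCHS):
    for k in range(P):
        pre, post = patterns[k], patterns[(k + 1) % P]
        W += EPS * np.outer(post - post.mean(), pre - pre.mean())

# Retrieval: start at pattern 0 and let the network run without external input.
state = patterns[0].copy()
for t in range(2 * P):
    state = step(state, W)
    overlaps = patterns @ state / ACTIVE
    print(f"t={t + 1}: closest pattern {overlaps.argmax()}, overlap {overlaps.max():.2f}")
```

The key design choice is the asymmetry of the outer product: potentiating connections from the units active at time t to those active at time t+1 turns what would otherwise be fixed-point attractors into transitions, so the free-running network steps through the stored sequence after training.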

Original language: English
Pages (from-to): 169-188
Number of pages: 20
Journal: Network: Computation in Neural Systems
Volume: 1
Issue number: 2
DOIs
State: Published - 1990
