Stochastic consolidation of lifelong memory

Nimrod Shaham, Jay Chandra, Gabriel Kreiman, Haim Sompolinsky*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Humans have the remarkable ability to continually store new memories while maintaining old memories for a lifetime. How the brain avoids catastrophic forgetting due to interference between encoded memories is an open problem in computational neuroscience. Here we present a model for continual learning in a recurrent neural network combining Hebbian learning, synaptic decay, and a novel memory consolidation mechanism: each memory undergoes stochastic rehearsals at a rate proportional to its basin of attraction, causing self-amplified consolidation. This mechanism gives rise to memory lifetimes that extend far beyond the synaptic decay time, and to a retrieval probability that decays gracefully with memory age. The number of retrievable memories is proportional to a power of the number of neurons. Perturbations to the circuit model cause temporally graded retrograde and anterograde deficits, mimicking observed memory impairments following neurological trauma.
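To make the abstract's three ingredients concrete, here is a minimal, illustrative sketch of how Hebbian encoding, uniform synaptic decay, and stochastic basin-proportional rehearsal might be combined in a Hopfield-style recurrent network. This is not the paper's actual implementation: the network size, decay rate, rehearsal constant, and the Monte Carlo basin-of-attraction estimate below are all assumptions chosen for readability.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100          # number of neurons (illustrative)
eta = 1.0 / N    # Hebbian learning rate (illustrative)
decay = 0.98     # per-step synaptic decay factor (illustrative)
T = 30           # number of encoding steps (illustrative)

W = np.zeros((N, N))
memories = []

def hebbian_increment(W, pattern):
    """Add an outer-product Hebbian term; keep the diagonal at zero."""
    dW = eta * np.outer(pattern, pattern)
    np.fill_diagonal(dW, 0.0)
    return W + dW

def retrieve(W, cue, steps=30):
    """Synchronous sign-function dynamics, run until a fixed point or a step cap."""
    s = cue.copy()
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

def basin_estimate(W, pattern, trials=10, flip_frac=0.15):
    """Crude Monte Carlo proxy for a memory's basin of attraction:
    the fraction of corrupted cues that flow back to the stored pattern."""
    hits = 0
    n_flip = int(flip_frac * N)
    for _ in range(trials):
        cue = pattern.copy()
        cue[rng.choice(N, size=n_flip, replace=False)] *= -1
        hits += np.array_equal(retrieve(W, cue), pattern)
    return hits / trials

for t in range(T):
    W *= decay                          # uniform synaptic decay
    xi = rng.choice([-1, 1], size=N)    # encode a new random memory
    memories.append(xi)
    W = hebbian_increment(W, xi)

    # Stochastic rehearsal: re-encode each stored memory with probability
    # proportional to its estimated basin of attraction, so memories with
    # large basins self-amplify their own consolidation.
    for m in memories:
        if rng.random() < 0.2 * basin_estimate(W, m):  # constant 0.2 is an assumption
            W = hebbian_increment(W, m)
```

In this toy version, decay alone would erase every memory within a few multiples of the decay time; the rehearsal loop is what lets well-consolidated memories outlive it, which is the qualitative effect the abstract describes.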

Original language: English
Article number: 13107
Journal: Scientific Reports
Volume: 12
Issue number: 1
DOIs
State: Published - Dec 2022

Bibliographical note

Publisher Copyright:
© 2022, The Author(s).
