Two-factor synaptic consolidation reconciles robust memory with pruning and homeostatic scaling
Abstract
Memory consolidation involves a process of engram reorganization and stabilization that is thought to occur primarily during sleep through a combination of neural replay, homeostatic plasticity, synaptic maturation, and pruning. From a computational perspective, however, this process remains puzzling, as it is unclear how the underlying mechanisms can be incorporated into a common mathematical model of learning and memory. Here, we propose a solution by deriving a consolidation model that uses replay and two-factor synapses to store memories in recurrent neural networks with sparse connectivity and maximal noise robustness. The model offers a unified account of experimental observations of consolidation, such as multiplicative homeostatic scaling, task-driven synaptic pruning, increased neural stimulus selectivity, and preferential strengthening of weak memories. The model further predicts that intrinsic synaptic noise scales sublinearly with synaptic strength; this is supported by a meta-analysis of published synaptic imaging datasets.
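To make the abstract's ingredients concrete, the following is a minimal toy sketch, not the paper's actual model or equations: every name, update rule, and parameter below is an illustrative assumption. It pairs a fast, labile weight with a slow, consolidated weight at each synapse, then cycles replay-driven consolidation, multiplicative homeostatic scaling, and pruning, and adds intrinsic noise that scales sublinearly with synaptic strength.

```python
import numpy as np

# Hypothetical two-factor synapse sketch: w_fast is an early, labile trace
# left by learning; w_slow is the consolidated component. These names and
# rules are assumptions for illustration, not taken from the paper.

rng = np.random.default_rng(0)
n = 200                           # synapses onto one model neuron
w_fast = rng.exponential(0.5, n)  # labile traces from recent learning
w_slow = np.zeros(n)              # consolidated factor, initially empty

def replay_consolidate(w_fast, w_slow, transfer=0.2):
    """One replay event: move a fraction of each fast trace into the
    slow, consolidated factor."""
    delta = transfer * w_fast
    return w_fast - delta, w_slow + delta

def homeostatic_scale(w_slow, target=50.0):
    """Multiplicative homeostatic scaling: one global factor rescales
    all consolidated weights toward a total-strength set point."""
    total = w_slow.sum()
    return w_slow * (target / total) if total > 0 else w_slow

def prune(w_slow, threshold=0.05):
    """Stand-in for task-driven pruning: synapses whose consolidated
    weight stays below threshold are eliminated."""
    return np.where(w_slow >= threshold, w_slow, 0.0)

def noisy_readout(w, alpha=0.5, scale=0.1):
    """Intrinsic synaptic noise scaling sublinearly with strength:
    sigma ~ w**alpha with alpha < 1 (an assumed functional form)."""
    return w + scale * (w ** alpha) * rng.standard_normal(w.shape)

for night in range(5):            # a few simulated consolidation cycles
    w_fast, w_slow = replay_consolidate(w_fast, w_slow)
    w_slow = homeostatic_scale(w_slow)
    w_slow = prune(w_slow)

readout = noisy_readout(w_slow)   # effective strengths seen downstream
print(f"surviving synapses: {(w_slow > 0).sum()} / {n}")
print(f"total consolidated strength: {w_slow.sum():.2f}")
```

Under these assumed rules, repeated cycles concentrate strength in a sparse set of surviving synapses while the multiplicative set point keeps total input constant, mirroring the combination of sparsification and homeostasis the abstract describes; the sublinear noise term only illustrates the form of the paper's prediction, not its derivation.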