Interleaved Replay of Novel and Familiar Memory Traces During Slow-Wave Sleep Prevents Catastrophic Forgetting


Abstract

Humans and animals can learn continuously, acquiring new knowledge while integrating it into a lifelong memory pool. In contrast, artificial neural networks (ANNs) suffer from catastrophic forgetting, where new training disrupts existing memories. This issue can be alleviated in ANNs by interleaving training on new tasks with past data; however, whether the brain uses a similar strategy is unknown. In this work, we show that slow-wave sleep interleaves replay of familiar and novel (i.e., hippocampal-dependent) memory traces within individual slow waves, allowing new memories to integrate into the existing cortical pool without interference. This study presents a novel theory for how memory traces acquired across an animal's life are organized within the cortical-hippocampal system to support continual learning, and it suggests principles applicable to a broad range of continual-learning AI systems.
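
The ANN strategy mentioned in the abstract, interleaving new-task training with past data, is commonly implemented as rehearsal from a replay buffer. The sketch below is not from the paper; it is a minimal, hedged illustration in PyTorch of how each gradient step can mix novel examples with replayed familiar ones. All class names, function names, and hyperparameters (e.g. `ReplayBuffer`, `train_task`, `capacity=1000`, `replay_batch=32`) are illustrative assumptions, not the authors' method.

```python
# Minimal sketch of interleaved rehearsal training (illustrative, not the paper's method).
import random
import torch


class ReplayBuffer:
    """Fixed-size reservoir of (input, label) pairs from earlier tasks."""

    def __init__(self, capacity=1000):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, x, y):
        # Reservoir sampling keeps an approximately uniform sample of everything seen so far.
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = (x, y)

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)


def train_task(model, loader, buffer, optimizer, loss_fn, replay_batch=32):
    """Train on one new task while interleaving replayed examples from old tasks."""
    model.train()
    for x_new, y_new in loader:
        x, y = x_new, y_new
        if buffer.data:
            # Interleave familiar (replayed) and novel examples so that a single
            # update sees both, the ANN analogue of interleaved replay.
            x_old, y_old = buffer.sample(replay_batch)
            x = torch.cat([x_new, x_old])
            y = torch.cat([y_new, y_old])
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        # Store the new examples so they can be replayed during future tasks.
        for xi, yi in zip(x_new, y_new):
            buffer.add(xi.detach(), yi.detach())
```

In this sketch, the buffer plays the role of the existing memory pool: because every update mixes old and new data, the gradients do not overwrite previously learned mappings, which is the forgetting-mitigation effect the abstract refers to.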
