Can sleep protect memories from catastrophic forgetting?
Abstract

Continual learning remains an unsolved problem in artificial neural networks. Biological systems have evolved mechanisms that prevent catastrophic forgetting of old knowledge during new training and thereby allow lifelong learning. Building upon data suggesting the importance of sleep in learning and memory, here we test the hypothesis that sleep protects memories from catastrophic forgetting. We found that training a thalamocortical network model on a “new” memory that interferes with a previously stored “old” memory may result in degradation and forgetting of the old memory trace. Simulating NREM sleep immediately after new learning leads to replay, which reverses the damage and ultimately enhances both the old and new memory traces. Surprisingly, we found that sleep replay goes beyond recovering old memory traces damaged by new learning. When a new memory competes for the neuronal/synaptic resources previously allocated to the old memory, sleep replay changes the synaptic footprint of the old memory trace, allowing overlapping populations of neurons to store multiple memories. Different neurons come to preferentially support different memory traces, enabling successful recall of both. We compared synaptic weight dynamics during sleep replay with those during interleaved training, a common approach to overcoming catastrophic forgetting in artificial networks, and found that interleaved training promotes synaptic competition and weakening of reciprocal synapses, effectively shrinking the ensemble of neurons contributing to memory recall. This leads to suboptimal recall performance compared to that after sleep. Together, our results suggest that sleep provides a powerful mechanism for continual learning, combining consolidation of new memory traces with reconsolidation of old memory traces to minimize interference.
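Since the abstract contrasts sleep replay with interleaved training, a minimal sketch of the latter may help orient readers. The following Python/NumPy toy is not the paper's thalamocortical model; it trains a single logistic unit on two overlapping random classification tasks, first sequentially (new learning overwrites the old task) and then interleaved. The task construction, network, and hyperparameters are all illustrative assumptions.

```python
# Toy demonstration of catastrophic forgetting vs. interleaved training.
# This is an illustrative sketch, NOT the paper's biophysical network model.
import numpy as np

rng = np.random.default_rng(0)

def make_task(n_samples=200, n_in=20):
    """Random binary input patterns labeled by a random linear rule."""
    X = rng.integers(0, 2, size=(n_samples, n_in)).astype(float)
    w_true = rng.normal(size=n_in)
    y = (X @ w_true > 0).astype(float)
    return X, y

def train(w, X, y, lr=0.1, epochs=50):
    """Plain logistic-regression gradient descent on one task."""
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w)))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return np.mean(((X @ w) > 0).astype(float) == y)

n_in = 20
Xa, ya = make_task(n_in=n_in)  # "old" memory (task A)
Xb, yb = make_task(n_in=n_in)  # "new" memory (task B)

# Sequential training: learn A, then train only on B.
w = np.zeros(n_in)
w = train(w, Xa, ya)
acc_a_before = accuracy(w, Xa, ya)
w = train(w, Xb, yb)
print(f"sequential: task A {acc_a_before:.2f} -> {accuracy(w, Xa, ya):.2f}, "
      f"task B {accuracy(w, Xb, yb):.2f}")

# Interleaved training: alternate single epochs on A and B.
w = np.zeros(n_in)
for _ in range(50):
    w = train(w, Xa, ya, epochs=1)
    w = train(w, Xb, yb, epochs=1)
print(f"interleaved: task A {accuracy(w, Xa, ya):.2f}, "
      f"task B {accuracy(w, Xb, yb):.2f}")
```

With independent random tasks, sequential training typically drags task A performance back toward chance, while interleaving settles on a compromise weight vector that supports both tasks; exact numbers vary with the seed. In the paper's network model, the analogous competition plays out among reciprocal synapses within overlapping neuronal ensembles, which is what sleep replay is reported to resolve more effectively than interleaving.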