AI & ML Efficiency Breakthrough

SleepGate introduces a biologically inspired 'sleep cycle' for the KV cache to resolve proactive interference in long-context LLMs.

March 17, 2026

Original Paper

Learning to Forget: Sleep-Inspired Memory Consolidation for Resolving Proactive Interference in Large Language Models

Ying Xie

arXiv · 2603.14517

The Takeaway

SleepGate addresses the failure mode in which outdated information in the context window degrades retrieval accuracy, using a learned forgetting gate and a consolidation module. This reduces the interference horizon from O(n) to O(log n), making long-horizon reasoning significantly more robust.
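As a rough illustration of the gating idea, the sketch below applies a toy forgetting gate to a miniature KV cache: each cached entry is scored from its key vector and age, and low-scoring (stale) entries are dropped. The linear scorer, the `ages` feature, and all weights here are illustrative assumptions; the paper's gate is learned end to end, not hand-set like this.

```python
import numpy as np

rng = np.random.default_rng(0)

def forgetting_gate(keys, values, ages, w, bias, threshold=0.5):
    """Score each cached (key, value) pair with a sigmoid keep-probability
    and drop entries the gate judges stale.  Toy stand-in for a learned
    forgetting gate over KV-cache entries."""
    features = np.concatenate([keys, ages[:, None]], axis=1)
    scores = 1.0 / (1.0 + np.exp(-(features @ w + bias)))  # sigmoid
    keep = scores >= threshold
    return keys[keep], values[keep], ages[keep]

# Toy KV cache: 8 entries with 4-dim keys; larger age = staler entry.
keys = rng.normal(size=(8, 4))
values = rng.normal(size=(8, 4))
ages = np.arange(8, dtype=float)

# Hand-set weights that penalise age, so stale entries are forgotten:
# keep-probability is sigmoid(3 - age), which crosses 0.5 at age 3.
w = np.array([0.0, 0.0, 0.0, 0.0, -1.0])
bias = 3.0

k2, v2, a2 = forgetting_gate(keys, values, ages, w, bias)
print(len(k2))  # 4 entries (ages 0..3) survive
```

With the age-penalising weights above, only the four freshest entries pass the gate; a learned gate would instead weigh key content and usage statistics, not age alone.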

From the abstract

Large language models (LLMs) suffer from proactive interference (PI): outdated information in the context window disrupts retrieval of current values. This interference degrades retrieval accuracy log-linearly as stale associations accumulate, a bottleneck that persists regardless of context length and resists prompt-engineering mitigations. Biological brains resolve an analogous challenge through sleep-dependent memory consolidation: synaptic downscaling, selective replay, and targeted forgetting.
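The three biological operations named in the abstract can be sketched as one consolidation pass over per-entry salience scores. Everything below is an illustrative assumption, not the paper's algorithm: `salience` stands in for whatever importance statistic the consolidation module tracks, and the three steps map to downscaling, replay, and forgetting in the obvious way.

```python
import numpy as np

def sleep_cycle(salience, downscale=0.8, replay_top_k=3, forget_below=0.1):
    """One toy consolidation pass:
    1. synaptic downscaling: shrink all scores uniformly,
    2. selective replay: restore the top-k entries to full strength,
    3. targeted forgetting: drop entries that fall below a floor.
    Returns indices of surviving entries and their new scores."""
    s = salience * downscale
    top = np.argsort(s)[-replay_top_k:]   # strongest entries get replayed
    s[top] /= downscale                   # replay undoes their downscaling
    keep = np.flatnonzero(s >= forget_below)
    return keep, s[keep]

salience = np.array([0.9, 0.05, 0.4, 0.12, 0.8, 0.07, 0.3])
keep, new_scores = sleep_cycle(salience)
print(keep)  # [0 2 4 6]: weak entries 1, 3, 5 are forgotten
```

The net effect matches the abstract's description: strong associations survive intact, middling ones are weakened, and the weakest are removed outright, so stale entries cannot accumulate without bound.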