AI & ML Efficiency Breakthrough

Achieves competitive continual learning accuracy with a 90% reduction in memory cost.

March 31, 2026

Original Paper

Mitigating Forgetting in Continual Learning with Selective Gradient Projection

Anika Singh, Aayush Dhaulakhandi, Varun Chopade, Likhith Malipati, David Martinez, Kevin Zhu

arXiv · 2603.26671


The Takeaway

Mitigating catastrophic forgetting usually requires large memory buffers for replay; this method instead uses selective gradient projection with per-layer gating to stabilize learning without that memory overhead. That matters for deploying adaptive models on resource-constrained edge devices, where replay buffers are often infeasible.

From the abstract

As neural networks are increasingly deployed in dynamic environments, they face the challenge of catastrophic forgetting: the tendency to overwrite previously learned knowledge when adapting to new tasks, resulting in severe performance degradation on earlier tasks. We propose Selective Forgetting-Aware Optimization (SFAO), a dynamic method that regulates gradient directions via cosine similarity and per-layer gating, enabling controlled forgetting while balancing plasticity and stability.
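The abstract's core mechanism, regulating gradient directions via cosine similarity with per-layer gating, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function names (`selective_projection`, `sfao_step`), the zero-similarity threshold, and the use of a stored per-layer reference gradient from earlier tasks are all assumptions for the sake of the example.

```python
import numpy as np

def cosine_similarity(g, g_ref, eps=1e-8):
    """Cosine similarity between a new-task gradient and a reference gradient."""
    return float(np.dot(g, g_ref) /
                 (np.linalg.norm(g) * np.linalg.norm(g_ref) + eps))

def selective_projection(g, g_ref, threshold=0.0):
    """If the new-task gradient conflicts with the reference direction
    (cosine similarity below the threshold), remove the conflicting
    component by projecting g onto the subspace orthogonal to g_ref.
    Otherwise, leave the gradient untouched (full plasticity)."""
    sim = cosine_similarity(g, g_ref)
    if sim < threshold:
        g = g - (np.dot(g, g_ref) / np.dot(g_ref, g_ref)) * g_ref
    return g, sim

def sfao_step(grads, ref_grads, threshold=0.0):
    """Per-layer gating (assumed form): the projection decision is made
    independently for each layer's gradient, so only layers whose updates
    conflict with prior knowledge are constrained."""
    return {name: selective_projection(g, ref_grads[name], threshold)[0]
            for name, g in grads.items()}
```

For a conflicting gradient, the projected update is orthogonal to the reference direction, so it no longer pushes against previously learned parameters; aligned gradients pass through unchanged, which is how this family of methods trades stability against plasticity without a replay buffer.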