MeMix is a training-free, plug-and-play module that cuts 3D reconstruction error by up to 40% on long sequences by mitigating state drift.
March 17, 2026
Original Paper
MeMix: Writing Less, Remembering More for Streaming 3D Reconstruction
arXiv · 2603.15330
The Takeaway
MeMix recasts the recurrent state of streaming 3D reconstruction models into a 'Memory Mixture' that selectively updates only the least-aligned patches. This prevents the catastrophic forgetting and drift that plague long-sequence streaming reconstruction, without requiring any fine-tuning.
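The paper's exact update rule isn't given in this excerpt, but the core idea of "writing less" to the memory can be sketched as follows. This is a minimal illustration, assuming the recurrent state is a bank of patch features; the function name, cosine-alignment score, and `write_fraction` parameter are all hypothetical, not from MeMix itself.

```python
import numpy as np

def selective_memory_update(memory, incoming, write_fraction=0.25):
    """Rewrite only the memory patches least aligned with incoming features.

    memory, incoming: (num_patches, dim) arrays of patch features.
    write_fraction: fraction of patches allowed to be rewritten per step.
    """
    # Cosine alignment between each memory patch and its incoming counterpart.
    m = memory / (np.linalg.norm(memory, axis=1, keepdims=True) + 1e-8)
    x = incoming / (np.linalg.norm(incoming, axis=1, keepdims=True) + 1e-8)
    alignment = (m * x).sum(axis=1)  # shape: (num_patches,)

    # Write only the k least-aligned patches; leave the rest untouched,
    # so well-aligned (already-consistent) state is never overwritten.
    k = max(1, int(write_fraction * len(memory)))
    write_idx = np.argsort(alignment)[:k]

    updated = memory.copy()
    updated[write_idx] = incoming[write_idx]
    return updated, write_idx
```

The intuition the sketch captures: by leaving well-aligned patches untouched at each step, the state accumulates fewer redundant writes, which is one plausible way a training-free module could slow drift over long sequences.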
From the abstract
Reconstruction is a fundamental task in 3D vision and a fundamental capability for spatial intelligence. Particularly, streaming 3D reconstruction is central to real-time spatial perception, yet existing recurrent online models often suffer from progressive degradation on long sequences due to state drift and forgetting, motivating inference-time remedies. We present MeMix, a training-free, plug-and-play module that improves streaming reconstruction by recasting the recurrent state into a Memory Mixture.