Gaussian Splatting just gave radar 'eyes,' enabling high-fidelity 3D mapping in total darkness and smoke.
April 16, 2026
Original Paper
RadarSplat-RIO: Indoor Radar-Inertial Odometry with Gaussian Splatting-Based Radar Bundle Adjustment
arXiv · 2604.13492
The Takeaway
Radar can see through smoke and total darkness, but it has always been too noisy to produce the detailed 3D maps we get from cameras. RadarSplat-RIO changes that by using Gaussian Splatting to jointly optimize sensor poses and scene geometry, bringing the precision of visual SLAM to a sensor that works where cameras are blind. That matters for autonomous vehicles and rescue robots: we can now build systems that reconstruct detailed indoor 3D scenes from radar data alone, effectively closing the gap between 'sensing' an environment and truly 'seeing' it.
From the abstract
Radar is more resilient to adverse weather and lighting conditions than visual and Lidar simultaneous localization and mapping (SLAM). However, most radar SLAM pipelines still rely heavily on frame-to-frame odometry, which leads to substantial drift. While loop closure can correct long-term errors, it requires revisiting places and relies on robust place recognition. In contrast, visual odometry methods typically leverage bundle adjustment (BA) to jointly optimize poses and map within a local window [...]
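The bundle-adjustment idea the abstract describes — jointly optimizing sensor poses and the map over a local window — can be sketched with a toy least-squares problem. The snippet below is an illustrative stand-in, not the paper's pipeline: 2-D translation-only poses and point landmarks replace radar Gaussians, and plain gradient descent replaces a proper BA solver.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: 2-D landmarks (stand-ins for Gaussian centers) and
# 2-D sensor poses, modeled as pure translations for simplicity.
true_points = rng.uniform(-5.0, 5.0, size=(8, 2))
true_poses = np.array([[0.0, 0.0], [1.0, 0.5], [2.0, 1.0], [3.0, 1.5]])

# Each pose observes every landmark in its own frame, plus noise.
obs = true_points[None, :, :] - true_poses[:, None, :]
obs += rng.normal(0.0, 0.01, size=obs.shape)

# Start the joint optimization from perturbed estimates.
points = true_points + rng.normal(0.0, 0.5, size=true_points.shape)
poses = true_poses + rng.normal(0.0, 0.5, size=true_poses.shape)
poses[0] = 0.0  # anchor the first pose to remove the gauge freedom

lr = 0.05
for _ in range(500):
    # Residual of every (pose, landmark) observation, shape (4, 8, 2).
    r = (points[None, :, :] - poses[:, None, :]) - obs
    points -= lr * 2.0 * r.sum(axis=0)  # gradient step w.r.t. landmarks
    poses += lr * 2.0 * r.sum(axis=1)   # gradient step w.r.t. poses
    poses[0] = 0.0                      # keep the anchor fixed

pose_err = float(np.abs(poses - true_poses).max())
point_err = float(np.abs(points - true_points).max())
print(f"max pose error: {pose_err:.3f}, max landmark error: {point_err:.3f}")
```

Note the anchored first pose: without it, shifting every pose and landmark by the same offset leaves all residuals unchanged, so the problem has no unique solution. Real BA systems solve the same kind of joint problem at scale with Gauss-Newton or Levenberg-Marquardt on sparse normal equations rather than raw gradient descent.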