New 360-degree video streaming treats objects on screen as if they have gravity, just so it can predict exactly where you're going to look next.
March 24, 2026
Original Paper
OrbitStream: Training-Free Adaptive 360-degree Video Streaming via Semantic Potential Fields
arXiv · 2603.20999
The Takeaway
Instead of using complex AI trained on your personal viewing habits, this system builds 'gravitational potential fields' around interesting objects like people or cars. By treating your gaze like an object caught in a gravity well, the stream can load high-quality detail exactly where you're about to look, before you get there.
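The paper doesn't publish its equations in this summary, but the gravity-well idea can be sketched in a few lines: sum an inverse-distance "well" for each detected object, then slide the current gaze point down the field's gradient to guess the next viewport. Everything here (the inverse-distance potential, the object masses, the step count) is an illustrative assumption, not OrbitStream's actual formulation.

```python
import numpy as np

def potential_field(grid_shape, objects):
    """Hypothetical semantic potential field over an equirectangular frame.

    `objects` is a list of (row, col, mass) tuples for salient things
    like people or cars; heavier mass = deeper well = stronger pull.
    """
    h, w = grid_shape
    ys, xs = np.mgrid[0:h, 0:w]
    field = np.zeros(grid_shape)
    for oy, ox, mass in objects:
        d = np.hypot(ys - oy, xs - ox) + 1.0  # +1 avoids div-by-zero at the object
        field -= mass / d                     # negative well centered on the object
    return field

def predict_gaze(gaze, field, steps=20):
    """Predict a future gaze point by descending the field's gradient."""
    gy, gx = np.gradient(field)
    y, x = gaze
    h, w = field.shape
    for _ in range(steps):
        # Step one pixel against the gradient (toward the deepest well).
        y = int(np.clip(y - np.sign(gy[y, x]), 0, h - 1))
        x = int(np.clip(x - np.sign(gx[y, x]), 0, w - 1))
    return y, x
```

With a single object placed at row 30, column 40, a gaze starting at (10, 10) drifts toward that object after a few steps; a streaming client could then prefetch high-bitrate tiles along that predicted path instead of around the current viewport.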
From the abstract
Adaptive 360° video streaming for teleoperation faces dual challenges: viewport prediction under uncertain gaze patterns and bitrate adaptation over volatile wireless channels. While data-driven and Deep Reinforcement Learning (DRL) methods achieve high Quality of Experience (QoE), their "black-box" nature and reliance on training data can limit deployment in safety-critical systems. To address this, we propose OrbitStream, a training-free framework that combines semantic scene understanding wit