Scientists found a way to make a basic home computer screw up math exactly like a super-expensive AI chip does.
March 24, 2026
Original Paper
Hawkeye: Reproducing GPU-Level Non-Determinism
arXiv · 2603.20421
The Takeaway
AI super-chips split big calculations across thousands of tiny cores, and the order in which the partial results get added back together can change from one run to the next. Because computers round tiny fractions, that shifting order means the same calculation can produce slightly different results every time. This new system, Hawkeye, can perfectly replicate these 'ghost' errors on a normal processor, finally allowing third parties to audit whether an AI was trained correctly without needing the original super-hardware.
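The root cause is something you can see on any machine: floating-point addition is not associative, so the order in which a chip sums its partial results changes the rounded answer. A minimal Python sketch (an illustration of the general phenomenon, not code from the paper):

```python
# Floating-point addition is not associative: rounding depends on
# the order of operations. This is why a GPU that sums partial
# results in a different order can give a different answer.
vals = [1e16, 1.0, -1e16]

# Left to right: 1.0 is too small to register next to 1e16,
# so it is rounded away before the big terms cancel.
left_to_right = (vals[0] + vals[1]) + vals[2]   # 0.0

# Reordered: the big terms cancel first, so 1.0 survives.
reordered = (vals[0] + vals[2]) + vals[1]       # 1.0

print(left_to_right, reordered)
```

Run it and the two sums disagree (0.0 versus 1.0), even though the same three numbers were added both times. Multiply this effect across the billions of additions inside a matrix multiplication and run-to-run drift becomes unavoidable, which is exactly the behavior Hawkeye sets out to reproduce bit-for-bit on a CPU.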
From the abstract
We present Hawkeye, a system for analyzing and reproducing GPU-level arithmetic operations. Using our framework, anyone can re-execute on a CPU the exact matrix multiplication operations underlying a machine learning model training or inference workflow that was executed on an NVIDIA GPU, without any precision loss. This is in stark contrast to prior approaches to verifiable machine learning, which either introduce significant computation overhead to the original model owner, or suffer from non-