Training an AI across a thousand separate computers can now match the accuracy of training with all the data pooled on one giant server.
April 25, 2026
Original Paper
Decentralized Machine Learning with Centralized Performance Guarantees via Gibbs Algorithms
arXiv · 2604.20492
The Takeaway
Developers have long accepted a performance tax: decentralized learning was assumed to be inherently worse than centralized training. This new application of Gibbs algorithms proves that the gap is not a mathematical necessity. Under the paper's framework, the system achieves centralized-level results without the participating devices ever sharing their private datasets. That undercuts a standard justification for hoarding user data in central repositories for the sake of model quality, and it points toward high-performance AI that respects user privacy by design.
From the abstract
In this paper, it is shown, for the first time, that centralized performance is achievable in decentralized learning without sharing the local datasets. Specifically, when clients adopt an empirical risk minimization with relative-entropy regularization (ERM-RER) learning framework and a forward-backward communication between clients is established, it suffices to share the locally obtained Gibbs measures to achieve the same performance as that of a centralized ERM-RER with access to all the datasets.
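Why sharing Gibbs measures can suffice: the ERM-RER solution for a risk L is the Gibbs measure with density proportional to q(θ)·exp(−L(θ)/λ), where q is a reference prior and λ the regularization strength. Local Gibbs measures therefore factor multiplicatively, so combining them (and dividing out the duplicated prior) recovers the Gibbs measure of the pooled risk. The toy sketch below illustrates this on a discrete parameter grid with two clients; the grid, risks, and variable names are illustrative assumptions, not the paper's setup, and the paper's forward-backward protocol is more general than this one-shot merge.

```python
import numpy as np

# Toy model space: 5 candidate parameter values (illustrative assumption).
theta = np.linspace(-2.0, 2.0, 5)
lam = 0.5                       # regularization strength (temperature)
q = np.full_like(theta, 1 / 5)  # uniform reference (prior) measure Q

# Hypothetical per-client empirical risks evaluated on the grid.
risk_a = (theta - 1.0) ** 2
risk_b = (theta + 0.5) ** 2

def gibbs(risk, q, lam):
    """ERM-RER solution: density proportional to q(theta)*exp(-risk/lam)."""
    w = q * np.exp(-risk / lam)
    return w / w.sum()

# Each client shares only its local Gibbs measure, never its data.
p_a = gibbs(risk_a, q, lam)
p_b = gibbs(risk_b, q, lam)

# Centralized ERM-RER: Gibbs measure of the pooled (summed) risk.
p_central = gibbs(risk_a + risk_b, q, lam)

# Multiplying the local Gibbs measures and dividing out the prior
# (counted once per client) recovers the centralized measure exactly.
p_merged = p_a * p_b / q
p_merged /= p_merged.sum()

print(np.allclose(p_merged, p_central))  # True
```

The key algebraic fact is that exp(−L_a/λ)·exp(−L_b/λ) = exp(−(L_a+L_b)/λ): the centralized risk is the sum of local risks, so no raw data is needed to reconstruct the centralized solution.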