Physics Collision

Some mathematical patterns look so much like noise that even an ideal learning algorithm cannot pick them up from random samples.

April 17, 2026

Original Paper

Arithmetic functions and learning theory

W. Burstein, A. Iosevich, A. Sant

arXiv · 2604.14482

The Takeaway

We usually assume that with enough data, a smart enough algorithm can find the pattern in anything. This paper shows that assumption breaks down for the Möbius function, a cornerstone of number theory that encodes information about prime factorizations. The authors prove that even with access to random samples, a learning algorithm needs an enormous number of them, on the order of Ω(R), to actually 'get' it. The function's values are essentially indistinguishable from random noise to a computer, even though they are fully deterministic. This highlights a fundamental blind spot in machine learning: some truths are baked so deeply into the logic of numbers that they are effectively invisible to statistical methods. It suggests there are patterns in our data that we may never unlock, no matter how powerful our computers get.
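To get a feel for why the Möbius function looks like noise, here is a short sketch (not from the paper) that computes μ(n) by trial factorization and checks two classical facts: its partial sums cancel heavily, and roughly 6/π² of integers are squarefree. The function name and constants are illustrative choices, not anything the authors use.

```python
# Sketch: compute the Möbius function mu(n) and observe its noise-like
# cancellation. mu(n) = 0 if a squared prime divides n, else (-1)^k where
# k is the number of distinct prime factors.

def mobius(n: int) -> int:
    """Return mu(n) via trial division."""
    if n == 1:
        return 1
    result = 1
    p = 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:      # p^2 divided the original n
                return 0
            result = -result    # one distinct prime factor found
        else:
            p += 1
    if n > 1:                   # leftover prime factor
        result = -result
    return result

R = 10_000
vals = [mobius(n) for n in range(1, R + 1)]
squarefree = [v for v in vals if v != 0]

# The partial sum stays tiny relative to R: the +1s and -1s cancel
# almost as thoroughly as fair coin flips would.
print(sum(vals))
# Fraction of squarefree integers up to R, close to 6/pi^2 ~ 0.608.
print(len(squarefree) / R)
```

The heavy cancellation in the partial sum is exactly the kind of pseudorandom behavior that, per the paper, makes the function statistically hard to learn from samples.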

From the abstract

We establish a connection between analytic number theory and computational learning theory by showing that the Möbius function belongs to a class of functions that is statistically hard to learn from random samples. Let $\mu_R$ denote the restriction of the Möbius function to the squarefree integers in $\{1,\dots,R\}$. Using a recent lower bound of Pandey and Radziwiłł for the $L^1$ norm of exponential sums with Möbius coefficients, we prove that
\[ \FR(\mu_R) \gg R^{-1/4-\epsilon} \]
for every
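For readers unfamiliar with it, the Möbius function that the abstract restricts to squarefree integers is the standard one from number theory:

```latex
\[
\mu(n) =
\begin{cases}
1 & \text{if } n = 1,\\
(-1)^k & \text{if } n \text{ is a product of } k \text{ distinct primes},\\
0 & \text{if } n \text{ is divisible by a perfect square} > 1.
\end{cases}
\]
```

On squarefree integers it takes only the values $+1$ and $-1$, which is why learning $\mu_R$ amounts to predicting a deterministic sequence of signs.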