AI is hitting a physical wall, and the culprit is data itself, dragging on computation like a heavy anchor.
March 30, 2026
Original Paper
Data Gravity and the Energy Limits of Computation
arXiv · 2603.26053
AI-generated illustration
The Takeaway
Moving data inside a computer consumes far more energy than the computation itself, a physical bottleneck the authors call data gravity. This energy limit suggests AI scaling will plateau unless chips are redesigned to mimic the brain's dense, energy-efficient integration of memory and compute.
From the abstract
Unlike the von Neumann architecture, which separates computation from memory, the brain tightly integrates them, an organization that large language models increasingly resemble. The crucial difference lies in the ratio of energy spent on computation versus data access: in the brain, most energy fuels compute, while in von Neumann architectures, data movement dominates. To capture this imbalance, we introduce the operation-operand disjunction constant G_d, a dimensionless measure of the …
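The abstract is truncated before G_d is defined, but a minimal sketch can illustrate the imbalance it describes. Assuming, purely for illustration, that G_d is roughly the ratio of data-access energy to compute energy per operation (one natural dimensionless choice, not the paper's stated definition), widely cited per-operation energy figures for ~45 nm CMOS (Horowitz, ISSCC 2014) already show data movement dominating by orders of magnitude:

```python
# Illustrative sketch: per-operation compute energy vs. the energy cost
# of fetching an operand from off-chip memory. G_d here is ASSUMED to be
# the ratio of data-access to compute energy; the paper's exact
# definition is not available in the truncated abstract.
#
# Energy figures are order-of-magnitude values commonly cited for
# ~45 nm CMOS (Horowitz, ISSCC 2014); treat them as illustrative only.

E_FP32_MULT_PJ = 3.7      # 32-bit float multiply, picojoules
E_DRAM_ACCESS_PJ = 640.0  # 32-bit read from off-chip DRAM, picojoules

def disjunction_ratio(e_data_pj: float, e_compute_pj: float) -> float:
    """Dimensionless ratio of data-movement energy to compute energy."""
    return e_data_pj / e_compute_pj

g_d = disjunction_ratio(E_DRAM_ACCESS_PJ, E_FP32_MULT_PJ)
print(f"G_d ~ {g_d:.0f}")  # on the order of a few hundred
```

On these assumed numbers, one DRAM fetch costs over a hundred times the arithmetic it feeds, which is the von Neumann imbalance the abstract contrasts with the brain, where most energy goes to compute.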