AI & ML Paradigm Shift

Logical reasoning in LLMs is causally linked to 'algebraic divergence' in the residual stream, and failure to achieve this geometry explains sycophancy.

March 26, 2026

Original Paper

The Geometric Price of Discrete Logic: Context-driven Manifold Dynamics of Number Representations

Long Zhang, Dai-jun Lin, Wei-neng Chen

arXiv · 2603.23577

The Takeaway

The paper moves beyond linear-isometric theory to show that models must 'topologically distort' their internal manifolds to forge discrete logical boundaries. This provides a concrete geometric diagnostic for why models hallucinate or cave to social pressure under specific prompts.

From the abstract

Large language models (LLMs) generalize smoothly across continuous semantic spaces, yet strict logical reasoning demands the formation of discrete decision boundaries. Prevailing theories relying on linear isometric projections fail to resolve this fundamental tension. In this work, we argue that task context operates as a non-isometric dynamical operator that enforces a necessary "topological distortion." By applying Gram-Schmidt decomposition to residual-stream activations, we reveal a dual-m…
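The abstract's mention of Gram-Schmidt decomposition can be illustrated with a minimal sketch. The code below is not the paper's method, only a toy example of the underlying technique: orthonormalizing a set of activation vectors so that each new direction is stripped of the components already spanned by earlier ones. The activation matrix here is random stand-in data, not real residual-stream activations.

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize the rows of `vectors`.

    Illustrative sketch only -- the paper's exact decomposition of
    residual-stream activations is not specified in the excerpt above.
    """
    basis = []
    for v in vectors:
        w = v.astype(float).copy()
        for b in basis:
            w -= np.dot(w, b) * b  # remove the component along b
        norm = np.linalg.norm(w)
        if norm > 1e-10:           # skip (near-)linearly-dependent vectors
            basis.append(w / norm)
    return np.array(basis)

# Toy stand-in for residual-stream activations: 4 vectors in R^8.
rng = np.random.default_rng(0)
acts = rng.normal(size=(4, 8))
Q = gram_schmidt(acts)

# The resulting basis is orthonormal: Q @ Q.T is (numerically) the identity.
print(np.allclose(Q @ Q.T, np.eye(len(Q)), atol=1e-8))
```

In an interpretability setting, such an orthogonal basis lets one measure how much of an activation lies along previously identified context directions versus a residual, which is the kind of decomposition the abstract alludes to.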