AI & ML Efficiency Breakthrough

Implements bio-inspired 'mental-state dynamics' to achieve O(N) complexity in Vision Transformers.

March 17, 2026

Original Paper

Scalable Machines with Intrinsic Higher Mental-State Dynamics

Ahsan Adeel, M. Bilal

arXiv · 2603.13453

The Takeaway

By using triadic modulation loops to pre-select information before attention is applied, this architecture achieves significantly faster learning on ImageNet-1K with fewer heads and tokens than standard ViTs, offering a scalable path for high-resolution vision tasks.
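The paper does not spell out its gating equations in this summary, but the idea of cheaply pre-selecting tokens before running full attention can be illustrated. Below is a minimal numpy sketch under stated assumptions: the "triadic" gate here is a hypothetical scalar relevance score per token built from pooled query context and value magnitudes, not the authors' actual formulation.

```python
import numpy as np

def preselect_attention(Q, K, V, k):
    """Toy attention that scores tokens with a cheap O(N*d) gate and runs
    full attention only over the top-k surviving tokens (hypothetical gate,
    for illustration only)."""
    N, d = K.shape
    q_bar = Q.mean(axis=0)                            # pooled query context
    gate = (K @ q_bar) * np.linalg.norm(V, axis=1)    # cheap per-token score
    idx = np.argsort(gate)[-k:]                       # keep k most relevant tokens
    Ks, Vs = K[idx], V[idx]
    # Standard scaled dot-product attention over the reduced set:
    scores = Q @ Ks.T / np.sqrt(d)                    # (N, k) instead of (N, N)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ Vs                               # (N, d)

rng = np.random.default_rng(0)
Q = rng.normal(size=(16, 8))
K = rng.normal(size=(16, 8))
V = rng.normal(size=(16, 8))
out = preselect_attention(Q, K, V, k=4)
print(out.shape)  # (16, 8)
```

Because the score matrix is (N, k) rather than (N, N), the attention cost grows linearly in N for a fixed token budget k, which is the scaling behavior the headline's O(N) claim refers to.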

From the abstract

Drawing on recent breakthroughs in cellular neurobiology and detailed biophysical modeling linking neocortical pyramidal neurons to distinct mental-state regimes, this work introduces a mathematically grounded formulation showing how models (e.g., Transformers) can implement computational principles underlying awake imaginative thought to pre-select relevant information before attention is applied via triadic modulation loops among queries ($Q$), keys ($K$), and values ($V$).
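To see why pre-selection matters at high resolution, a back-of-envelope comparison helps. The sketch below counts score-matrix multiplies for full self-attention versus attention over a fixed budget of k pre-selected tokens; the token counts and budget are illustrative assumptions, not figures from the paper.

```python
# Back-of-envelope cost of full vs. pre-selected attention.
# Full self-attention over N tokens of width d needs ~N^2 * d multiplies
# for the score matrix; a fixed budget of k surviving tokens drops this
# to ~N * k * d, i.e. linear in N for constant k.
def attn_mults(N, d, k=None):
    return N * (k if k is not None else N) * d

N, d, k = 4096, 64, 128       # e.g. a high-resolution patch grid (illustrative)
full = attn_mults(N, d)       # quadratic term
pre = attn_mults(N, d, k)     # linear term under a fixed token budget
print(full // pre)  # 32
```

At these illustrative sizes the reduction is N/k = 32x, and the gap widens as image resolution (and hence N) grows, which is why such schemes target high-resolution vision tasks.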