AI & ML Paradigm Shift

A novel neural primitive based on metriplectic dynamics that outperforms Transformers in data efficiency and generalization.

April 1, 2026

Original Paper

Metriplector: From Field Theory to Neural Architecture

Dan Oprisa, Peter Toth

arXiv · 2603.29496

The Takeaway

Metriplector replaces standard linear/nonlinear layers with coupled physical field dynamics, achieving zero-shot generalization from 15x15 to 39x39 grids and 3.6x better training efficiency in language modeling. It points to a future where computation is defined by the evolution of abstract physical systems rather than by static stacks of layers.
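For context, "metriplectic" has a standard meaning in mathematical physics (this gloss is mine, not the paper's notation): the state evolves under the sum of a conservative Poisson-bracket term and a dissipative metric-bracket term,

$$\dot z \;=\; L(z)\,\nabla H(z) \;+\; M(z)\,\nabla S(z), \qquad L = -L^{\top},\quad M = M^{\top} \succeq 0,$$

with the degeneracy conditions $L\,\nabla S = 0$ and $M\,\nabla H = 0$, so the energy $H$ is exactly conserved while the entropy $S$ is non-decreasing along trajectories.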

From the abstract

We present Metriplector, a neural architecture primitive in which the input configures an abstract physical system -- fields, sources, and operators -- and the dynamics of that system is the computation. Multiple fields evolve via coupled metriplectic dynamics, and the stress-energy tensor $T^{\mu\nu}$, derived from Noether's theorem, provides the readout. The metriplectic formulation admits a natural spectrum of instantiations: the dissipative branch alone yields a screened Poisson equation solver […]
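To make the "dissipative branch yields a screened Poisson equation solver" claim concrete, here is a minimal sketch (my own illustration, not the paper's code): a purely dissipative metriplectic system is a gradient flow, and the flow $\partial_t\varphi = \nabla^2\varphi - m^2\varphi + \rho$ has as its fixed point the solution of the screened Poisson equation $(\nabla^2 - m^2)\varphi = -\rho$. The grid size, step size, and function name below are assumptions for illustration.

```python
import numpy as np

def screened_poisson_flow(rho, m=1.0, dx=1.0, dt=0.2, steps=20000):
    """Relax phi under the dissipative (gradient-flow) dynamics
        d phi/dt = laplacian(phi) - m^2 * phi + rho
    on a periodic 1-D grid; the fixed point satisfies
        (laplacian - m^2) phi = -rho,
    i.e. the screened Poisson equation. Illustrative sketch only."""
    phi = np.zeros_like(rho)
    for _ in range(steps):
        # periodic second-difference Laplacian
        lap = (np.roll(phi, 1) + np.roll(phi, -1) - 2.0 * phi) / dx**2
        phi += dt * (lap - m**2 * phi + rho)  # explicit Euler step
    return phi

n = 64
x = np.arange(n)
rho = np.sin(2 * np.pi * x / n)   # smooth periodic source term
phi = screened_poisson_flow(rho)

# residual of (laplacian - m^2) phi + rho at the relaxed state
lap = np.roll(phi, 1) + np.roll(phi, -1) - 2.0 * phi
residual = np.max(np.abs(lap - phi + rho))
```

At convergence the residual is at numerical precision, which is exactly the sense in which "the dynamics is the computation": the answer is read off from the system's equilibrium rather than from a stack of layers.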