AI & ML Paradigm Shift

The Physics-Guided Transformer (PGT) embeds physical priors (like diffusion and causality) directly into the self-attention mechanism via heat-kernel biases.

March 31, 2026

Original Paper

Physics-Guided Transformer (PGT): Physics-Aware Attention Mechanism for PINNs

Ehsan Zeraatkar, Rodion Podorozhny, Jelena Tešić

arXiv · 2603.27929

The Takeaway

PGT moves beyond treating physics as a soft penalty term in the loss function (the standard PINN approach) and instead builds physical constraints into the architecture's inductive bias. The result is markedly more stable and physically consistent field reconstructions from sparse data.
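The paper's exact formulation is not reproduced here, but the core idea of a heat-kernel attention bias can be sketched: add the log of a Gaussian heat kernel over the observation coordinates to the attention logits, so that attention decays with physical distance the way diffusion does. The function names, the diffusion-time parameter `t`, and the single-head layout below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def heat_kernel_bias(coords, t=0.1):
    """Log Gaussian heat kernel exp(-||x_i - x_j||^2 / (4t)) as an additive bias.

    coords: (n, p) array of observation-point positions (hypothetical layout).
    """
    # Pairwise squared Euclidean distances between observation points.
    d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    # Taking the log of the kernel turns a multiplicative prior into an
    # additive bias on the attention logits.
    return -d2 / (4.0 * t)

def physics_biased_attention(Q, K, V, coords, t=0.1):
    """Scaled dot-product attention with a distance-based physical prior."""
    d = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d) + heat_kernel_bias(coords, t)
    # Numerically stable softmax over the key axis.
    w = np.exp(logits - logits.max(-1, keepdims=True))
    w /= w.sum(-1, keepdims=True)
    return w @ V
```

Under this sketch, two tokens at the same location get zero bias, while distant tokens are penalized before the softmax, which biases the model toward diffusion-consistent, spatially local interactions without any change to the loss function.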

From the abstract

Reconstructing continuous physical fields from sparse, irregular observations is a central challenge in scientific machine learning, particularly for systems governed by partial differential equations (PDEs). Existing physics-informed methods typically enforce governing equations as soft penalty terms during optimization, often leading to gradient imbalance, instability, and degraded physical consistency under limited data. We introduce the Physics-Guided Transformer (PGT), a neural architecture …
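For contrast, the soft-penalty approach the abstract critiques can be sketched for a 1D Poisson problem, u'' = f: a data term on sparse observations plus a finite-difference residual of the governing equation, blended by a weight `lam`. The function and parameter names are hypothetical; tuning `lam` by hand is exactly the source of the gradient imbalance the abstract describes.

```python
import numpy as np

def soft_penalty_loss(u, f, dx, obs_idx, u_obs, lam=1.0):
    """PINN-style composite loss for the 1D Poisson equation u'' = f.

    u: candidate field values on a uniform grid with spacing dx.
    obs_idx / u_obs: indices and values of the sparse observations.
    lam: penalty weight trading data fit against the physics residual.
    """
    # Data term: fit the sparse, irregular observations.
    data_term = np.mean((u[obs_idx] - u_obs) ** 2)
    # Physics term: second-order central-difference residual at interior points.
    u_xx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    physics_term = np.mean((u_xx - f[1:-1]) ** 2)
    # The governing equation enters only as a soft penalty, not as structure.
    return data_term + lam * physics_term
```

Because the physics enters only through this weighted penalty, nothing in the model itself prevents reconstructions that violate the PDE between observation points; the architectural route taken by PGT removes that dependence on penalty tuning.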