AI & ML Paradigm Shift

Introduces geometry-aware parallel refinement for diffusion language models, bypassing fixed-block decoding limitations.

March 31, 2026

Original Paper

GeoBlock: Inferring Block Granularity from Dependency Geometry in Diffusion Language Models

Lipeng Wan, Junjie Ma, Jianhui Gu, Zeyang Liu, Xuyang Lu, Xuguang Lan

arXiv · 2603.26675

The Takeaway

GeoBlock moves diffusion decoding from heuristic, fixed block sizes to 'dependency geometry': semantically cohesive regions are refined in parallel, while regions with strong causal ordering are updated sequentially. This improves the speed-accuracy trade-off of diffusion LMs without requiring retraining.

From the abstract

Block diffusion enables efficient parallel refinement in diffusion language models, but its decoding behavior depends critically on block size. Existing block-sizing strategies rely on fixed rules or heuristic signals and do not account for the dependency geometry that determines which tokens can be safely refined together. This motivates a geometry view of diffusion decoding: regions with strong causal ordering require sequential updates, whereas semantically cohesive regions admit parallel […]
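The geometry view above can be caricatured with a toy sketch. This is not the paper's algorithm; the function name, the scalar dependency scores, and the greedy threshold rule are all illustrative assumptions. The idea shown: positions joined by weak dependencies are grouped into one block (safe to refine in parallel), while a strong dependency marks a causal boundary that forces a new, sequentially decoded block.

```python
def infer_blocks(dep, threshold=0.5):
    """Greedy toy partition (hypothetical, for illustration only).

    dep[i] is an assumed scalar dependency strength between token
    positions i and i+1. While the score stays below `threshold`,
    the current block is extended (cohesive region -> parallel
    refinement); a score at or above it closes the block (strong
    causal ordering -> sequential boundary).
    """
    blocks, current = [], [0]
    for i, score in enumerate(dep, start=1):
        if score < threshold:
            current.append(i)        # cohesive: same parallel block
        else:
            blocks.append(current)   # causal boundary: start new block
            current = [i]
    blocks.append(current)
    return blocks

# Toy pairwise scores for 7 token positions.
dep = [0.1, 0.2, 0.9, 0.1, 0.8, 0.05]
print(infer_blocks(dep))  # → [[0, 1, 2], [3, 4], [5, 6]]
```

The contrast with fixed-block decoding is the point: a fixed size would cut the sequence every k tokens regardless of where the strong dependencies fall, whereas here block boundaries land exactly on the high-dependency edges.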