AI & ML Efficiency Breakthrough

Q-Drift corrects quantization-induced noise in diffusion models using a plug-and-play sampler adjustment that requires only 5 calibration runs.

March 20, 2026

Original Paper

Q-Drift: Quantization-Aware Drift Correction for Diffusion Model Sampling

Sooyoung Ryu, Mathieu Salzmann, Saqib Javed

arXiv · 2603.18095

The Takeaway

Quantizing diffusion models for edge deployment often results in severe FID degradation; Q-Drift treats this as a stochastic drift and corrects it during sampling. It achieves significant quality gains (up to 4.59 FID reduction) with near-zero inference overhead, making low-bit (W3/W4) diffusion models feasible for production.

From the abstract

Post-training quantization (PTQ) is a practical path to deploy large diffusion models, but quantization noise can accumulate over the denoising trajectory and degrade generation quality. We propose Q-Drift, a principled sampler-side correction that treats quantization error as an implicit stochastic perturbation on each denoising step and derives a marginal-distribution-preserving drift adjustment. Q-Drift estimates a timestep-wise variance statistic from calibration, in practice requiring as few as five calibration runs.
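The paper does not spell out the correction formula here, but the idea it describes (calibrate a per-timestep variance of the quantization perturbation, then adjust the sampler so the marginal variance of each step is preserved) can be sketched as variance matching. The helper names below are hypothetical, and the shrink-the-injected-noise rule is one plausible instance of a marginal-preserving adjustment, not the authors' exact derivation:

```python
import numpy as np

def calibrate_qnoise_var(fp_eps_preds, q_eps_preds):
    """Per-timestep variance of the quantization perturbation.

    Both inputs are arrays of shape [runs, T, ...]: noise predictions from
    the full-precision and quantized models on the same calibration inputs
    (a handful of runs, per the paper as few as five).
    Returns a [T]-shaped variance statistic.
    """
    diffs = np.asarray(q_eps_preds) - np.asarray(fp_eps_preds)
    # Variance over the feature axes, then average over calibration runs.
    per_run = diffs.var(axis=tuple(range(2, diffs.ndim)))  # [runs, T]
    return per_run.mean(axis=0)                            # [T]

def corrected_noise_scale(sigma_t, qvar_t):
    """Shrink the sampler's injected-noise scale at timestep t so that the
    sum of injected noise variance and the calibrated quantization-noise
    variance still equals the scheduler's target sigma_t**2 (assumed
    independence; clamped at zero when quantization noise dominates)."""
    return float(np.sqrt(max(sigma_t ** 2 - qvar_t, 0.0)))
```

At inference the quantized model runs unchanged; only the noise scale fed to each stochastic sampler step is replaced, which is consistent with the "near-zero inference overhead" claim.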