AI & ML Efficiency Breakthrough

Scales Maximum Entropy population synthesis from 20 to 50+ categorical attributes by replacing exact expectation sums with Persistent Contrastive Divergence.

March 31, 2026

Original Paper

Scalable Maximum Entropy Population Synthesis via Persistent Contrastive Divergence

Mirko Degli Esposti

arXiv · 2603.27312

The Takeaway

It reduces the cost of the expectation step from exponential to linear in the number of attributes, allowing researchers to generate synthetic populations over state spaces roughly 18 orders of magnitude larger than was previously computationally feasible.
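The scale of that growth can be checked with toy arithmetic. Assuming four categories per attribute (an illustrative figure, not taken from the paper), moving from K = 20 to K = 50 attributes multiplies the tuple space by 4^30, about 18 orders of magnitude:

```python
import math

# Toy arithmetic behind the "18 orders of magnitude" figure. The
# category count C = 4 is an assumption for illustration only.
K_old, K_new, C = 20, 50, 4

size_old = C ** K_old          # |X| with 20 attributes
size_new = C ** K_new          # |X| with 50 attributes
growth = size_new // size_old  # factor by which the tuple space grows

print(f"|X| at K={K_old}: {size_old:.3e}")         # ~1.1e12
print(f"|X| at K={K_new}: {size_new:.3e}")         # ~1.3e30
print(f"growth: ~10^{round(math.log10(growth))}")  # ~10^18
```

Any summation over the full tuple space therefore becomes hopeless well before K = 50, which is the gap the stochastic estimator is meant to close.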

From the abstract

Maximum entropy (MaxEnt) modelling provides a principled framework for generating synthetic populations from aggregate census data, without access to individual-level microdata. The bottleneck of existing approaches is exact expectation computation, which requires summing over the full tuple space $\mathcal{X}$ and becomes infeasible for more than $K \approx 20$ categorical attributes. We propose GibbsPCDSolver, a stochastic replacement for this computation based on Persistent Contrastive Divergence.
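The core idea, replacing the exact sum over $\mathcal{X}$ with Monte Carlo estimates from Gibbs chains that persist across parameter updates, can be sketched in a few lines. This is a toy illustration under stated assumptions, not the paper's GibbsPCDSolver: it uses unary (marginal) features only, synthetic target marginals, and invented sizes for K, C, and the number of chains.

```python
import numpy as np

rng = np.random.default_rng(0)
K, C, M = 5, 3, 500   # attributes, categories per attribute, persistent chains (toy sizes)

# MaxEnt parameters for unary (marginal) features, plus synthetic target
# marginals standing in for aggregate census tables.
theta = np.zeros((K, C))
target = rng.dirichlet(np.ones(C), size=K)   # target P(x_k = c); rows sum to 1

chains = rng.integers(0, C, size=(M, K))     # chain states persist across updates

def gibbs_sweep(chains, theta):
    """One Gibbs sweep: resample each attribute in turn. With unary-only
    features the conditionals do not depend on the other attributes, but
    the sweep structure is unchanged under richer feature sets."""
    for k in range(K):
        p = np.exp(theta[k] - theta[k].max())  # stable softmax of log-potentials
        p /= p.sum()
        chains[:, k] = rng.choice(C, size=M, p=p)
    return chains

lr = 0.2
for _ in range(300):
    chains = gibbs_sweep(chains, theta)      # NOT reinitialised each step: "persistent"
    # Monte Carlo marginals replace the exact expectation over all C**K tuples.
    model_marg = np.stack(
        [np.bincount(chains[:, k], minlength=C) / M for k in range(K)]
    )
    theta += lr * (target - model_marg)      # dual-ascent step on the MaxEnt objective

print(np.abs(target - model_marg).max())    # small residual: marginals are matched
```

The key design point is that the chains are never reset: each gradient step continues the sampler from where the last step left it, so a handful of sweeps per update suffices instead of a fresh burn-in, which is what makes the per-iteration cost linear in K rather than exponential.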