AI & ML Efficiency Breakthrough

IConE enables stable self-supervised learning even at batch size 1, overcoming the memory bottlenecks of high-dimensional scientific and medical data.

March 17, 2026

Original Paper

IConE: Batch Independent Collapse Prevention for Self-Supervised Representation Learning

Konstantinos Almpanakis, Anna Kreshuk

arXiv · 2603.15263

The Takeaway

IConE decouples representation-collapse prevention from batch statistics by using a global set of learnable auxiliary embeddings. This makes it possible to train on high-resolution 3D data where large batch sizes are physically impossible under GPU memory limits.
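The idea can be illustrated with a toy per-sample objective. The sketch below is not IConE's actual loss (the paper's formulation is not reproduced here); it only shows how softmax-assigning each augmented view of a sample to a global bank of learnable auxiliary embeddings yields a well-defined training signal at batch size 1, with no batch statistics involved. All names (`aux_embedding_loss`, `assign`) are hypothetical.

```python
import numpy as np

def aux_embedding_loss(z1, z2, aux, tau=0.1):
    """Toy per-sample SSL loss: each view is softmax-assigned to a global
    bank of auxiliary embeddings, and the two views' assignment
    distributions are pulled together via cross-entropy. Nothing here
    depends on other samples in the batch, so the loss is well-defined
    at batch size 1. Illustrative sketch only, not IConE's loss."""
    def assign(z):
        z = z / np.linalg.norm(z)
        a = aux / np.linalg.norm(aux, axis=1, keepdims=True)
        logits = a @ z / tau          # cosine similarity to each aux embedding
        logits -= logits.max()        # numerical stability
        p = np.exp(logits)
        return p / p.sum()
    p, q = assign(z1), assign(z2)
    return -np.sum(p * np.log(q + 1e-9))

rng = np.random.default_rng(0)
d, k = 8, 16
aux = rng.normal(size=(k, d))          # global learnable bank (held fixed here)
z1 = rng.normal(size=d)                # embedding of view 1 of a single sample
z2 = z1 + 0.05 * rng.normal(size=d)    # embedding of view 2 (augmented)
loss = aux_embedding_loss(z1, z2, aux) # finite loss from a batch of one
```

In training, the auxiliary bank would be a learnable parameter updated jointly with the encoder; holding it fixed here keeps the sketch minimal.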

From the abstract

Self-supervised learning (SSL) has revolutionized representation learning, with Joint-Embedding Architectures (JEAs) emerging as an effective approach for capturing semantic features. Existing JEAs rely on implicit or explicit batch interaction -- via negative sampling or statistical regularization -- to prevent representation collapse. This reliance becomes problematic in regimes where batch sizes must be small, such as high-dimensional scientific data, where memory constraints and class imbalance [...]
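To see concretely why batch interaction breaks down at small batch sizes, consider a VICReg-style variance hinge, one common form of the statistical regularization the abstract mentions (used here purely for illustration, not taken from the paper). The penalty depends on the spread of embeddings across the batch, so with a single sample the per-dimension standard deviation is zero and the penalty saturates at a constant, carrying no useful gradient signal against collapse.

```python
import numpy as np

def variance_regularizer(z, eps=1e-4, gamma=1.0):
    """Batch-statistics collapse penalty: a hinge on the per-dimension
    standard deviation computed ACROSS the batch (VICReg-style sketch).
    Only meaningful when the batch contains diverse samples."""
    std = np.sqrt(z.var(axis=0) + eps)
    return np.mean(np.maximum(0.0, gamma - std))

rng = np.random.default_rng(0)
big = rng.normal(size=(256, 8))   # healthy batch: per-dim std near 1
one = rng.normal(size=(1, 8))     # batch size 1: per-dim std is exactly 0

penalty_big = variance_regularizer(big)  # near zero
penalty_one = variance_regularizer(one)  # saturates near gamma, uninformative
```

At batch size 1 the penalty is the same constant for every input, so it cannot distinguish a collapsed encoder from a healthy one, which is exactly the regime IConE targets.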