AI & ML Scaling Insight

Neural collapse is triggered when the mean feature norm crosses a predictable 'feature-norm threshold' (fn*) that is largely invariant to training conditions, offering a new diagnostic for training progress.

April 2, 2026

Original Paper

Neural Collapse Dynamics: Depth, Activation, Regularisation, and Feature Norm Threshold

Anamika Paul Rupa

arXiv · 2604.00230

The Takeaway

The paper identifies a concrete, actionable metric for predicting when representational reorganization occurs in deep networks. It lets practitioners monitor training dynamics beyond loss curves, pinpointing the point where a model transitions from noisy representations to structured feature learning.
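As a rough illustration of the kind of monitoring this enables, the sketch below (not the paper's code) uses a PyTorch forward hook to track the mean penultimate-layer feature norm during training and flag when it crosses a threshold. The toy model, random data, and the value assigned to fn_star are all placeholder assumptions; in practice fn* would be estimated per (model, dataset) pair.

```python
# Minimal sketch: track mean penultimate-layer feature norm during training.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Flatten(),
    nn.Linear(784, 256), nn.ReLU(),
    nn.Linear(256, 128), nn.ReLU(),  # model[3], model[4]: penultimate-layer features
    nn.Linear(128, 10),              # classifier head
)

feature_norms = []

def record_feature_norm(module, inputs, output):
    # Mean L2 norm of penultimate-layer features over the batch.
    feature_norms.append(output.detach().norm(dim=1).mean().item())

# Hook the activation that produces the penultimate-layer features.
hook = model[4].register_forward_hook(record_feature_norm)

fn_star = 12.0  # hypothetical critical value; the paper estimates it per (model, dataset)

opt = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    x = torch.randn(64, 1, 28, 28)           # stand-in for real training data
    y = torch.randint(0, 10, (64,))
    loss = loss_fn(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

    mean_fn = feature_norms[-1]
    if mean_fn >= fn_star:
        print(f"step {step}: mean feature norm {mean_fn:.2f} crossed fn* ~ {fn_star}")
        break

hook.remove()
```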

From the abstract

Neural collapse (NC) -- the convergence of penultimate-layer features to a simplex equiangular tight frame -- is well understood at equilibrium, but the dynamics governing its onset remain poorly characterised. We identify a simple and predictive regularity: NC occurs when the mean feature norm reaches a model-dataset-specific critical value, fn*, that is largely invariant to training conditions. This value concentrates tightly within each (model, dataset) pair (CV 0.2). [...]
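For context, the simplex-ETF convergence the abstract refers to is typically quantified with diagnostics like the ones sketched below: within-class variability collapse and equiangularity of the centered class means. The function name, toy features, and labels are illustrative assumptions, not the paper's measurements.

```python
# Sketch of two standard neural-collapse diagnostics over penultimate-layer
# features H with labels y (assumed already collected from a trained network).
import torch

def nc_diagnostics(H: torch.Tensor, y: torch.Tensor, num_classes: int):
    # Global mean and per-class means of the features.
    mu_G = H.mean(dim=0)
    class_means = torch.stack([H[y == c].mean(dim=0) for c in range(num_classes)])
    centered = class_means - mu_G

    # NC1: ratio of within-class to between-class scatter (shrinks toward 0 at collapse).
    within = torch.stack([
        (H[y == c] - class_means[c]).pow(2).sum(dim=1).mean()
        for c in range(num_classes)
    ]).mean()
    between = centered.pow(2).sum(dim=1).mean()
    nc1 = (within / between).item()

    # NC2: pairwise cosines of centered class means; an exact simplex ETF gives
    # cos = -1/(K-1) for every pair of classes.
    normed = centered / centered.norm(dim=1, keepdim=True)
    cos = normed @ normed.T
    off_diag = cos[~torch.eye(num_classes, dtype=torch.bool)]
    etf_target = -1.0 / (num_classes - 1)
    nc2 = (off_diag - etf_target).abs().max().item()
    return nc1, nc2

# Toy usage with random features (real use: features harvested from a trained model).
H = torch.randn(1000, 128)
y = torch.randint(0, 10, (1000,))
print(nc_diagnostics(H, y, num_classes=10))
```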