A faster Forward-Forward algorithm flies through the ImageNet dataset 40 times quicker than the original version.
Backpropagation has dominated AI training for decades but requires massive memory and energy for its backward pass. This bio-inspired method enables single-pass training and inference by projecting data onto a hypersphere. Researchers previously struggled to make these alternatives scale to complex real-world images. This implementation finally bridges that gap and offers a computationally efficient path for hardware that cannot support traditional gradients. Energy-constrained edge devices could soon train complex models locally without the overhead of modern optimization engines.
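The contrast between the original FF inference and a hypersphere-plus-prototypes scheme can be sketched in a few lines. Everything below is an illustrative toy, not the paper's implementation: the layer, the goodness score, and the prototype construction are all assumptions made for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(16, 8))          # one toy layer's weights (hypothetical)

def layer(x):
    # Forward pass with the activation vector projected onto the unit hypersphere
    h = np.maximum(W @ x, 0.0)        # ReLU
    return h / (np.linalg.norm(h) + 1e-8)

def ff_infer(x, num_classes=4):
    # Original FF-style inference: overlay each candidate label on the input
    # and run one forward pass per class, picking the highest "goodness"
    # (sum of squared activities). Cost grows linearly with the class count.
    scores = []
    for c in range(num_classes):
        x_lab = x.copy()
        x_lab[:num_classes] = 0.0
        x_lab[c] = 1.0                # one-hot label embedded in the input
        h = np.maximum(W @ x_lab, 0.0)
        scores.append(np.sum(h ** 2))
    return int(np.argmax(scores))

# Prototype-style inference: one forward pass, then nearest class prototype
# on the hypersphere by cosine similarity. Prototypes here are random stand-ins.
prototypes = rng.normal(size=(4, 16))
prototypes /= np.linalg.norm(prototypes, axis=1, keepdims=True)

def hff_infer(x):
    h = layer(x)                      # a single forward pass, independent of class count
    return int(np.argmax(prototypes @ h))
```

The key point the sketch makes concrete: `ff_infer` scales with the number of classes (a thousand passes for ImageNet), while `hff_infer` costs one pass plus a dot product against the prototype matrix.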
Hyperspherical Forward-Forward with Prototypical Representations
arXiv · 2605.00082
The Forward-Forward (FF) algorithm presents a compelling, bio-inspired alternative to backpropagation. However, while efficient in training, its inference process is computationally prohibitive, requiring a separate forward pass for every class being evaluated. In this work, we introduce the Hyperspherical Forward-Forward (HFF) algorithm, a novel reformulation that resolves this critical bottleneck. Our core innovation is to reframe the local objective of each layer from a binary goodness-of-fit ta