SeriesFusion
Science, curated & edited by AI

A new kind of spiking neural network can learn complex tasks without using the backpropagation algorithm that powers almost all modern AI.

Backpropagation is the standard way to train AI, but it demands large amounts of energy and memory because every weight update depends on a global error signal routed backward through the network. This work instead trains deep networks with local plasticity rules gated by neuromodulatory signals, in a way that more closely mimics biological brains. Because it sidesteps the surrogate gradients normally needed to backpropagate through spiking neurons, the approach can run on specialized neuromorphic hardware at a fraction of the power drawn by traditional GPUs. The architecture demonstrates that high-level learning is achievable through strictly local interactions between neurons, suggesting that future hardware could be far more efficient by ditching the global calculations that define current machine learning.
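To make "local plasticity gated by neuromodulation" concrete, here is a minimal sketch of a generic three-factor learning rule of the kind this family of methods builds on. It is illustrative only, not the paper's algorithm: each synapse accumulates a decaying eligibility trace from coincident pre- and postsynaptic spikes, and a global neuromodulatory scalar (for example, a reward signal) decides when that trace is committed to the weight. No error is propagated backward; every update uses only information available at the synapse. The trace decay `tau_e` and learning rate `lr` are assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pre, n_post = 8, 4
w = rng.normal(0.0, 0.1, size=(n_post, n_pre))   # synaptic weights
elig = np.zeros_like(w)                          # per-synapse eligibility traces

tau_e = 0.9   # eligibility decay per step (assumed)
lr = 0.05     # learning rate (assumed)

def plasticity_step(pre_spikes, post_spikes, modulator):
    """One purely local update: decay the traces, add pre/post spike
    coincidences, then let the global neuromodulator gate the write."""
    global elig, w
    elig = tau_e * elig + np.outer(post_spikes, pre_spikes)
    w += lr * modulator * elig

# One step with random spikes and a positive neuromodulatory signal.
pre = (rng.random(n_pre) < 0.3).astype(float)
post = (rng.random(n_post) < 0.3).astype(float)
plasticity_step(pre, post, modulator=1.0)
```

The key property is that the modulator is the only global quantity, and it is a single broadcast scalar rather than a per-weight gradient.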

Original Paper

Scalable Learning in Structured Recurrent Spiking Neural Networks without Backpropagation

Bo Tang, Weiwei Xie

arXiv  ·  2605.00402

Spiking Neural Networks (SNNs) provide a promising framework for energy-efficient and biologically grounded computation; however, scalable learning in deep recurrent architectures with sparse connectivity remains a major challenge. In this work, we propose a structured multi-layer recurrent SNN architecture composed of locally dense recurrent layers augmented with sparse small-world long-range projections to a readout population. The long-range connectivity is largely fixed, preserving routing e
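The "sparse small-world long-range projections" in the abstract can be pictured with a short sketch. This is not the authors' construction, but the standard Watts-Strogatz recipe often used for such wiring: start from a ring lattice where each neuron projects to its k nearest neighbours, then rewire a small fraction p of those edges to random targets. The result keeps connection cost low while giving short average path lengths, and the mask is fixed, matching the abstract's point that the long-range connectivity is largely not learned.

```python
import numpy as np

def small_world_mask(n, k=4, p=0.1, seed=0):
    """Boolean (n, n) projection mask in the Watts-Strogatz style:
    ring lattice with k neighbours per neuron, each edge rewired
    to a random target with probability p."""
    rng = np.random.default_rng(seed)
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):                      # ring lattice
        for j in range(1, k // 2 + 1):
            mask[i, (i + j) % n] = True
            mask[i, (i - j) % n] = True
    for i in range(n):                      # random rewiring
        for j in np.flatnonzero(mask[i]):
            if rng.random() < p:
                mask[i, j] = False
                new = int(rng.integers(n))
                while new == i or mask[i, new]:
                    new = int(rng.integers(n))
                mask[i, new] = True
    return mask

m = small_world_mask(64, k=4, p=0.1)
print(m.sum())  # rewiring preserves degree: 64 neurons * 4 edges = 256
```

Because each rewire removes one edge and adds one, the mask stays exactly as sparse as the original lattice; only the reach of the connections changes.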