Challenges the 80-year-old assumption that neurons must use weighted summation as their primary aggregation mechanism.
March 23, 2026
Original Paper
Beyond Weighted Summation: Learnable Nonlinear Aggregation Functions for Robust Artificial Neurons
arXiv · 2603.19344
The Takeaway
By replacing linear sums with learnable nonlinear aggregators (F-Mean and Gaussian Support), the authors report substantially higher robustness to noisy inputs (a robustness score of 0.99 vs. 0.89 for standard weighted summation) without sacrificing trainability. This opens a new architectural dimension for robust deep learning.
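To make the idea concrete, here is a minimal sketch of an F-Mean-style aggregator. It uses a quasi-arithmetic mean with a power generator f(t) = t^p, where the exponent p plays the role of a learnable parameter; the paper's exact generator and the Gaussian Support variant are not reproduced here, so this parameterization is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

def weighted_sum(x, w):
    """Classic linear aggregation: the neuron's pre-activation."""
    return np.sum(w * x)

def power_f_mean(x, w, p=0.5):
    """Quasi-arithmetic (F-)mean aggregation with generator f(t) = t^p.

    Computes f^-1(mean(f(|w_i * x_i|))). In the paper's setting, p would be
    learned by gradient descent; here it is a fixed hypothetical value.
    For p < 1, the map compresses large values, so a single extreme input
    moves the output far less than it moves a plain weighted sum.
    """
    z = np.abs(w * x) + 1e-12          # keep the power map well-defined at 0
    return np.mean(z ** p) ** (1.0 / p)

x_clean = np.array([1.0, 1.0, 1.0, 1.0])
x_noisy = np.array([1.0, 1.0, 1.0, 100.0])   # one corrupted input
w = np.ones(4)

# Linear aggregation jumps from 4 to 103; the F-mean moves far less.
print(weighted_sum(x_clean, w), weighted_sum(x_noisy, w))
print(power_f_mean(x_clean, w), power_f_mean(x_noisy, w))
```

Because the aggregation is differentiable in both the weights and p, it can be dropped into a standard neuron and trained end to end, which is the trainability claim the takeaway refers to.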
From the abstract
Weighted summation has remained the default input aggregation mechanism in artificial neurons since the earliest neural network models. While computationally efficient, this design implicitly behaves like a mean-based estimator and is therefore sensitive to noisy or extreme inputs. This paper investigates whether replacing fixed linear aggregation with learnable nonlinear alternatives can improve neural network robustness without sacrificing trainability. Two differentiable aggregation mechanisms, F-Mean and Gaussian Support, are proposed.
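The "mean-based estimator" point can be seen directly: a weighted sum is n times the weighted mean of its inputs, and the mean has no resistance to a single extreme value. The sketch below contrasts it with the median, used here only as a classical statistical analogue of the paper's learnable robust aggregators (the median itself is not one of the proposed mechanisms).

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=64)      # well-behaved inputs to one neuron
x_noisy = x.copy()
x_noisy[0] = 50.0                      # corrupt a single input

# The linear neuron's pre-activation is proportional to the mean,
# so one extreme input shifts it by (50 - x[0]) / 64, a large jump
# relative to the inputs' unit scale.
shift_mean = abs(np.mean(x_noisy) - np.mean(x))

# A robust aggregator barely moves: changing one of 64 values leaves
# the median almost where it was.
shift_median = abs(np.median(x_noisy) - np.median(x))

print(shift_mean, shift_median)        # mean shift dwarfs the median shift
```

This is exactly the sensitivity the abstract attributes to weighted summation, and it motivates searching for aggregation functions that are robust like the median yet differentiable and learnable like the linear sum.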