AI & ML Paradigm Shift

Continued Fraction Neural Networks (CFNNs) introduce a rational inductive bias that captures singularities with 10-100x fewer parameters than standard MLPs.

March 24, 2026

Original Paper

CFNN: Continued Fraction Neural Network

Chao Wang, Xuancheng Zhou, Ruilin Hou, Xiaoyu Cheng, Ruiyi Ding

arXiv · 2603.20634

The Takeaway

By building continued fractions into the network architecture, the paper offers a 'grey-box' model for scientific computing that is far more parameter-frugal and robust to noise than conventional neural networks when modeling complex physical functions.
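
To see why a rational inductive bias helps with singularities, consider the generalized continued fraction a depth-K model of this kind would realize; the affine parameterization of the coefficients below is our illustrative assumption, not a detail given in this summary:

$$
f(x) \;=\; b_0(x) + \cfrac{a_1(x)}{b_1(x) + \cfrac{a_2(x)}{\ddots \; + \cfrac{a_K(x)}{b_K(x)}}},
\qquad a_k(x) = u_k^\top x + c_k, \quad b_k(x) = v_k^\top x + d_k.
$$

Truncating at depth K yields a rational function of the input, so poles arise naturally wherever a denominator vanishes. A smooth basis like an MLP's activations can only approximate such high-curvature features with many terms, which is exactly the spectral-bias problem the abstract describes.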

From the abstract

Accurately characterizing non-linear functional manifolds with singularities is a fundamental challenge in scientific computing. While Multi-Layer Perceptrons (MLPs) dominate, their spectral bias hinders resolving high-curvature features without excessive parameters. We introduce Continued Fraction Neural Networks (CFNNs), integrating continued fractions with gradient-based optimization to provide a "rational inductive bias." This enables capturing complex asymptotics and discontinuities with far fewer parameters.
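
A minimal sketch of the idea in PyTorch, assuming a layer that evaluates a learnable continued fraction whose partial numerators and denominators are affine maps of the input. The class name, the depth/eps parameters, and the denominator stabilization are our assumptions; the paper's actual architecture may differ.

```python
import torch
import torch.nn as nn

class CFNNLayer(nn.Module):
    """Illustrative continued-fraction layer (not the paper's exact design):
        f(x) = b_0(x) + a_1(x) / (b_1(x) + a_2(x) / (... + a_K(x) / b_K(x)))
    where each a_k, b_k is a learnable affine map of the input."""

    def __init__(self, in_dim: int, out_dim: int = 1, depth: int = 4, eps: float = 1e-3):
        super().__init__()
        self.a = nn.ModuleList([nn.Linear(in_dim, out_dim) for _ in range(depth)])
        self.b = nn.ModuleList([nn.Linear(in_dim, out_dim) for _ in range(depth + 1)])
        self.eps = eps  # guard band for denominators (stabilization scheme assumed)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Evaluate bottom-up: t_K = b_K(x), then t_k = b_k(x) + a_{k+1}(x) / t_{k+1}.
        t = self.b[-1](x)
        for k in range(len(self.a) - 1, -1, -1):
            # Shift the denominator at least eps away from zero so the division
            # cannot produce inf/nan; the sign is preserved and gradients flow.
            t = torch.where(t >= 0, t + self.eps, t - self.eps)
            t = self.b[k](x) + self.a[k](x) / t
        return t

# Usage: fit a function with a pole at x = 0.5, sampling away from the singularity.
torch.manual_seed(0)
x = torch.cat([torch.linspace(0.0, 0.45, 128), torch.linspace(0.55, 1.0, 128)]).unsqueeze(1)
y = 1.0 / (x - 0.5)
model = CFNNLayer(in_dim=1, depth=4)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(model(x), y)
    loss.backward()
    opt.step()
```

Because the fraction is rational in x, a pole like the one in this target lies inside the model's native function class, whereas a standard MLP must spend many parameters approximating the blow-up near x = 0.5.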