SeriesFusion
Science, curated & edited by AI

AI solves math problems much better when it remembers how it solved them before, rather than just reading a textbook.

Retrieval-Augmented Generation (RAG) significantly improves reasoning when it retrieves thinking traces rather than static documents. These traces are step-by-step records of how the system approached a problem in the past. This challenges the standard view that RAG only helps with factual lookup and offers little for pure reasoning. It shows that how to think is itself a retrievable asset that transfers across tasks, letting smaller, weaker models perform like much larger ones simply by following a better thought process.
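The core mechanism can be sketched in a few lines. The snippet below is a minimal, illustrative mock-up, not the paper's implementation: it stores a tiny hypothetical corpus of past thinking traces, retrieves the most similar one for a new problem (using a bag-of-words cosine similarity as a stand-in for a real embedding model), and prepends it to the prompt RAG-style. All names and example data are invented for illustration.

```python
import math
from collections import Counter

# Hypothetical in-memory corpus of thinking traces: step-by-step records
# of how past problems were solved (contents are illustrative only).
TRACE_CORPUS = [
    {"problem": "Find the sum of the first n odd numbers.",
     "trace": "Step 1: compute small cases 1, 4, 9. "
              "Step 2: conjecture n^2. Step 3: prove by induction."},
    {"problem": "Count lattice paths in a grid.",
     "trace": "Step 1: model each path as a word of moves. "
              "Step 2: count with the binomial coefficient C(m+n, n)."},
]

def bag_of_words(text):
    """Tokenize into lowercase word counts (a stand-in for a real embedder)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(count * b[token] for token, count in a.items() if token in b)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def retrieve_traces(query, corpus, k=1):
    """Return the k past thinking traces whose problems best match the query."""
    query_vec = bag_of_words(query)
    scored = sorted(corpus,
                    key=lambda d: cosine(query_vec, bag_of_words(d["problem"])),
                    reverse=True)
    return scored[:k]

def build_prompt(query, corpus):
    """Prepend retrieved traces to the new problem, RAG-style."""
    context = "\n".join(f"Past approach: {t['trace']}"
                        for t in retrieve_traces(query, corpus))
    return f"{context}\n\nNew problem: {query}\nSolve step by step."

prompt = build_prompt("What is the sum of the first 10 odd numbers?", TRACE_CORPUS)
```

In a real system the bag-of-words matcher would be replaced by a dense retriever, and the corpus would hold traces generated by the model itself during earlier solving attempts; the point is that retrieval keys on the problem while the payload is the reasoning process, not a factual document.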

Original Paper

RAG over Thinking Traces Can Improve Reasoning Tasks

Negar Arabzadeh, Wenjie Ma, Sewon Min, Matei Zaharia

arXiv  ·  2605.03344

Retrieval-augmented generation (RAG) has proven effective for knowledge-intensive tasks, but is widely believed to offer limited benefit for reasoning-intensive problems such as math and code generation. We challenge this assumption by showing that the limitation lies not in RAG itself, but in the choice of corpus. Instead of retrieving documents, we propose retrieving thinking traces, i.e., intermediate thinking trajectories generated during problem solving attempts. We show that thinking trace [...]