SeriesFusion
Science, curated & edited by AI

Relevant background information actually slashes AI performance by nearly half on certain complex design tasks.

Multi-agent systems can perform up to 46% worse when given exactly the context that seems most relevant to a problem. The standard assumption is that more relevant data leads to better decisions, but a high-quality document can act as an anchor that suppresses creative exploration. On some tasks, an irrelevant document even matches or outperforms every relevant artifact. The findings suggest that context can function as a distractor, steering agents away from searching for more efficient solutions. Designers who want to preserve innovation may need to reconsider the standard practice of flooding agents with helpful information.
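The kind of comparison behind these numbers can be sketched as an ablation: run the same design task repeatedly under each context-injection condition and compare an exploration metric across conditions. The sketch below is illustrative only; the condition names, the metric, and the scoring function are hypothetical stand-ins, not the paper's actual harness.

```python
import random

# Hypothetical context-injection ablation sketch. Condition names and the
# scoring function are illustrative assumptions, not taken from the paper.
CONDITIONS = ["no_context", "relevant_doc", "irrelevant_doc"]

def run_design_task(task_id: str, condition: str, seed: int) -> float:
    """Stand-in for one agent run; returns a mock exploration score in [0, 1].

    Seeded deterministically so repeated calls are reproducible.
    """
    rng = random.Random(f"{task_id}|{condition}|{seed}")
    return rng.uniform(0.0, 1.0)

def ablate(task_id: str, runs_per_condition: int = 30) -> dict[str, float]:
    """Average the exploration metric over repeated runs per condition."""
    return {
        cond: sum(run_design_task(task_id, cond, s)
                  for s in range(runs_per_condition)) / runs_per_condition
        for cond in CONDITIONS
    }

scores = ablate("rate_limiter_design")
baseline = scores["no_context"]
for cond, score in scores.items():
    change = (score - baseline) / baseline * 100
    print(f"{cond}: {score:.3f} ({change:+.1f}% vs no_context)")
```

A crossover effect would show up here as the sign of the percentage change flipping between tasks for the same condition, e.g. `relevant_doc` improving on one task and degrading on another.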

Original Paper

When Context Hurts: The Crossover Effect of Knowledge Transfer on Multi-Agent Design Exploration

Saranyan Vigraham

arXiv  ·  2605.04361

The prevailing assumption in agent orchestration is that more context is better. We test this on multi-agent software design across 10 tasks, 7 context-injection conditions, and over 2,700 runs, and find a crossover effect: the same artifact type improves design exploration on some tasks (up to 20$\times$ tradeoff coverage) and actively degrades it on others (up to 46% reduction). On several tasks, an irrelevant document performs as well as or better than every relevant artifact. The direction i