Shifts retrieval from static contrastive vector alignment to dynamic reasoning trajectories using a generative model (T1) trained with GRPO (Group Relative Policy Optimization).
March 19, 2026
Original Paper
CRE-T1 Preview Technical Report: Beyond Contrastive Learning for Reasoning-Intensive Retrieval
arXiv · 2603.17387
The Takeaway
Challenges the decade-long dominance of contrastive learning (BERT-style bi-encoders) in IR by treating retrieval as a reasoning task. This lets models bridge vocabulary gaps and capture implicit relationships that fixed embeddings miss.
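A toy sketch of the vocabulary-gap failure mode (this is our illustration, not the paper's code): a static bag-of-words similarity, standing in for any fixed-representation scorer, ranks a document that merely shares a surface word above one that is relevant only through reasoning (e.g. "heart attack" vs. "myocardial infarction").

```python
# Illustrative only: static lexical vectors miss reasoning-level relevance.
# A fixed scorer computes one number per pair and cannot revise it at inference.

def bow_vector(text, vocab):
    """Bag-of-words count vector over a shared vocabulary."""
    words = text.lower().split()
    return [words.count(w) for w in vocab]

def cosine(a, b):
    """Cosine similarity between two count vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

query = "heart attack treatment"
# Relevant via medical reasoning, but shares no words with the query:
doc_reasoning = "myocardial infarction is managed with aspirin and reperfusion"
# Irrelevant, but shares the surface word "attack":
doc_lexical = "the general planned a surprise attack at dawn"

vocab = sorted(set(" ".join([query, doc_reasoning, doc_lexical]).lower().split()))
q = bow_vector(query, vocab)
d_reason = bow_vector(doc_reasoning, vocab)
d_lex = bow_vector(doc_lexical, vocab)

print("reasoning-relevant doc:", cosine(q, d_reason))  # 0.0 — no shared terms
print("lexical-match doc:     ", cosine(q, d_lex))     # > 0 — spurious overlap
```

The static scorer ranks the spurious document first; a generative reasoning model, as advocated here, can instead judge relevance by inferring the implicit query-document relationship at inference time.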
From the abstract
The central challenge of reasoning-intensive retrieval lies in identifying implicit reasoning relationships between queries and documents, rather than superficial semantic or lexical similarity. The contrastive learning paradigm is fundamentally a static representation consolidation technique: during training, it encodes hierarchical relevance concepts into fixed geometric structures in the vector space, and at inference time it cannot dynamically adjust relevance judgments according to the spec…