
AI companies might have no legal power to stop people from using their models to train better, cheaper competitors.

Industry-standard anti-distillation clauses, the API terms that prohibit using a model's outputs to train competing models, may be legally void under existing law. Copyright preemption and antitrust doctrine offer several pathways to invalidate these restrictions. Many AI companies assume they can lock in their dominance by contractually forbidding others from training on their outputs; the paper's argument suggests these barriers are fragile and likely to be overturned in court. If these contractual protections fall, competition in the AI field could intensify sharply.

Original Paper

Five Doctrinal Pathways to Invalidate the Anti-Distillation Clause

Gaston Rey

SSRN  ·  6700818

On 30 April 2026, under cross-examination in the trial of Musk v. Altman in the Northern District of California, Elon Musk admitted that xAI had "partly" distilled OpenAI's models to train Grok, characterising the practice as standard industry conduct. The dominant frame in commentary treats the admission as a contractual problem: API Terms of Service prohibit using outputs to develop competing models, the conduct occurred, breach is established. This paper argues that the more interesting