SeriesFusion
Science, curated & edited by AI

The perplexity of AI-generated sentences shifts sharply when their words are shuffled, while human writing remains stubbornly stable.

Large language models produce text with a specific structural fragility that humans do not share. Scrambling the word order of an AI response causes a massive and predictable shift in the perplexity the model assigns to that text, while human prose exhibits a natural resilience to the same shuffling because it rests on deeper semantic foundations. This zero-shot detection method exploits that statistical difference to catch machine-generated text without any training on specific datasets, offering a way to verify content authenticity that is much harder for future models to spoof through simple prompt engineering.
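The core idea above can be sketched in a few lines: score a text by how much its perplexity changes when its words are randomly shuffled. As a minimal, self-contained illustration, the sketch below stands in a toy add-one-smoothed bigram model for the large language model the paper actually uses; the function names (`shuffle_shift`, `perplexity`) and the toy corpus are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def bigram_logprob(tokens, counts, unigrams, vocab_size, alpha=1.0):
    """Add-alpha smoothed bigram log-probability of a token sequence."""
    lp = 0.0
    for prev, cur in zip(tokens, tokens[1:]):
        num = counts.get((prev, cur), 0) + alpha
        den = unigrams.get(prev, 0) + alpha * vocab_size
        lp += math.log(num / den)
    return lp

def perplexity(tokens, counts, unigrams, vocab_size):
    """Per-token perplexity under the bigram model (toy stand-in for an LLM)."""
    n = len(tokens) - 1
    return math.exp(-bigram_logprob(tokens, counts, unigrams, vocab_size) / n)

def shuffle_shift(text, counts, unigrams, vocab_size, n_shuffles=20, seed=0):
    """Relative perplexity increase after word shuffling.

    Text that the model finds fluent in order but implausible when shuffled
    (the hypothesized signature of machine-generated text) scores high;
    order-robust text scores low.
    """
    rng = random.Random(seed)
    tokens = text.lower().split()
    base = perplexity(tokens, counts, unigrams, vocab_size)
    shuffled = []
    for _ in range(n_shuffles):
        t = tokens[:]
        rng.shuffle(t)
        shuffled.append(perplexity(t, counts, unigrams, vocab_size))
    mean_shuffled = sum(shuffled) / n_shuffles
    return (mean_shuffled - base) / base

# Build the toy bigram "language model" from a tiny corpus.
corpus = "the cat sat on the mat . the dog sat on the rug .".split()
counts, unigrams = {}, {}
for a, b in zip(corpus, corpus[1:]):
    counts[(a, b)] = counts.get((a, b), 0) + 1
    unigrams[a] = unigrams.get(a, 0) + 1
vocab = len(set(corpus))

# An in-distribution sentence: fluent in order, fragile under shuffling.
score = shuffle_shift("the cat sat on the mat", counts, unigrams, vocab)
print(f"shuffle shift score: {score:.3f}")  # positive: perplexity rises
```

In the real method a pretrained language model supplies the perplexities, and the shift separates machine text (large jump) from human text (small jump); here the toy model only demonstrates the mechanics of the score.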

Original Paper

Luminol-AIDetect: Fast Zero-shot Machine-Generated Text Detection based on Perplexity under Text Shuffling

Lucio La Cava, Andrea Tagarelli

arXiv  ·  2604.25860

Machine-generated text (MGT) detection requires identifying structurally invariant signals across generation models, rather than relying on model-specific fingerprints. In this respect, we hypothesize that while large language models excel at local semantic consistency, their autoregressive nature results in a specific kind of structural fragility compared to human writing. We propose Luminol-AIDetect, a novel, zero-shot statistical approach that exposes this fragility through coherence disruption.