YoloFS is a new filesystem designed specifically to stop AI agents from accidentally deleting your life's work.
April 16, 2026
Original Paper
Don't Let AI Agents YOLO Your Files: Shifting Information and Control to Filesystems for Agent Safety and Autonomy
arXiv · 2604.13536
The Takeaway
As we give AI agents access to our local files, the risk of a 'YOLO' command deleting everything grows. YoloFS addresses this by moving safety into the filesystem itself. It uses a staging-and-snapshot system designed specifically for agentic workflows, providing an automatic 'undo' button that the agent can't override. This is the missing link for AI autonomy: a sandbox that doesn't feel like a sandbox. Instead of trusting an LLM to be 'careful,' you build a filesystem that assumes it will make mistakes. This unlocks safe, full-system autonomy for AI developers and power users.
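To make the staging-and-snapshot idea concrete, here is a minimal sketch of the pattern in Python. This is not YoloFS's implementation (the paper's actual mechanism lives at the filesystem layer, below the agent); the `SnapshotGuard` class, its method names, and the copy-based snapshots are all illustrative assumptions. The key property it demonstrates is that the undo state lives outside the directory the agent can touch.

```python
import shutil
import tempfile
from pathlib import Path

class SnapshotGuard:
    """Illustrative sketch of staging-and-snapshot for agent safety.

    A snapshot of the working tree is taken before each risky agent
    action; snapshots are stored *outside* the agent-visible directory,
    so the agent cannot override or delete its own undo history.
    (Hypothetical API -- not the paper's actual interface.)
    """

    def __init__(self, workdir: Path):
        self.workdir = Path(workdir)
        self.snapshots: list[Path] = []

    def snapshot(self) -> int:
        """Copy the working tree aside; return a snapshot id."""
        dest = Path(tempfile.mkdtemp(prefix="yolofs-snap-")) / "tree"
        shutil.copytree(self.workdir, dest)
        self.snapshots.append(dest)
        return len(self.snapshots) - 1

    def rollback(self, snap_id: int) -> None:
        """The automatic 'undo button': restore the tree from a snapshot."""
        shutil.rmtree(self.workdir)
        shutil.copytree(self.snapshots[snap_id], self.workdir)

# Demo: the agent deletes a file, and the guard restores it.
work = Path(tempfile.mkdtemp(prefix="agent-work-"))
(work / "thesis.txt").write_text("years of work")

guard = SnapshotGuard(work)
snap_id = guard.snapshot()

(work / "thesis.txt").unlink()   # the agent 'YOLOs' the file
guard.rollback(snap_id)

restored = (work / "thesis.txt").read_text()
```

A real filesystem-level design would use copy-on-write rather than full copies, but the control-flow shape is the same: stage, act, and roll back without asking the agent's permission.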
From the abstract
AI coding agents operate directly on users' filesystems, where they regularly corrupt data, delete files, and leak secrets. Current approaches force a tradeoff between safety and autonomy: unrestricted access risks harm, while frequent permission prompts burden users and block agents. To understand this problem, we conduct the first systematic study of agent filesystem misuse, analyzing 290 public reports across 13 frameworks. Our analysis reveals that today's agents have limited information abo