AI & ML Practical Magic

You can physically crash a machine just by feeding its AI some bad data, even if you have no idea how the machine actually works.

April 10, 2026

Original Paper

Data Poisoning Attacks Can Systematically Destabilize Data-Driven Control Synthesis

Vijayanand Digge, Martina Vanelli, Ahmad W. Al-Dabbagh, Julien M. Hendrickx, Gianluca Bianchin

arXiv · 2604.08392

The Takeaway

The study exposes a severe vulnerability in 'black-box', data-driven controllers: by corrupting only the data used for synthesis, an attacker can systematically force the resulting controller to fail. This shifts the focus of control system security from protecting the hardware to securing the entire data pipeline.

From the abstract

Data-driven control has emerged as a powerful paradigm for synthesizing controllers directly from data, bypassing explicit model identification. However, this reliance on data introduces new and largely unexplored vulnerabilities. In this paper, we show that an attacker can systematically poison the data used for control synthesis, causing any linear state-feedback controller synthesized by the planner to destabilize the physical system. Concerningly, we show that the attacker can achieve this o…
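To make the attack surface concrete, here is a minimal toy sketch of the idea, not taken from the paper: a planner identifies a scalar plant from logged (state, input, next-state) data by least squares and synthesizes a deadbeat state-feedback gain. If an attacker tampers with the logged inputs (here, simply flipping their sign), the planner estimates the input gain with the wrong sign, and the "stabilizing" controller it produces actually destabilizes the true plant. All numbers and the deadbeat design are illustrative assumptions.

```python
import numpy as np

# Toy scalar plant: x[k+1] = a*x[k] + b*u[k]  (hypothetical values, not from the paper)
a_true, b_true = 1.2, 1.0

rng = np.random.default_rng(0)
N = 50
x = rng.standard_normal(N)
u = rng.standard_normal(N)
x_next = a_true * x + b_true * u  # noiseless logged trajectory data

def synthesize_deadbeat_gain(x, u, x_next):
    """'Model-free' pipeline: least-squares fit of (a, b) from data,
    then a deadbeat gain K so that u = -K*x 'cancels' the estimated dynamics.
    Returns K and the pole of the TRUE closed loop, a_true - b_true*K."""
    Phi = np.column_stack([x, u])
    a_hat, b_hat = np.linalg.lstsq(Phi, x_next, rcond=None)[0]
    K = a_hat / b_hat              # places the pole at 0 for the estimated model
    pole = a_true - b_true * K     # what actually happens on the real plant
    return K, pole

# Clean data: identification is exact, the closed loop is stable (|pole| < 1).
_, pole_clean = synthesize_deadbeat_gain(x, u, x_next)

# Poisoned data: the attacker flips the sign of the logged inputs, so the
# planner estimates b_hat ≈ -b_true, K flips sign, and the feedback now
# injects energy into the plant instead of removing it.
_, pole_poisoned = synthesize_deadbeat_gain(x, -u, x_next)

print(f"clean closed-loop pole:    {pole_clean:+.3f}")    # ≈ 0.000 → stable
print(f"poisoned closed-loop pole: {pole_poisoned:+.3f}")  # ≈ +2.400 → unstable
```

Note that the attacker never touches the plant or the controller code, only the dataset; the paper's contribution is showing this kind of destabilization can be induced systematically, and against any linear state-feedback controller the planner might synthesize, not just a single fixed design as in this sketch.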