SeriesFusion

A robotic ultrasound probe can now locate internal organs using nothing but a single photo of the skin.

Medical robots typically need a pre-existing MRI or CT scan to know where to start looking for an internal organ. This new semantic mapping engine lets a robot infer a patient's internal layout from a single photograph of their external anatomy, eliminating the need for expensive preparatory imaging or a human technician to guide the probe into position. Because the system can initialize a high-fidelity scan autonomously, it could make advanced diagnostics accessible in remote areas and emergency rooms. In effect, it gives robots the kind of anatomical intuition experienced doctors use to navigate the human body.
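To make the idea concrete, here is a toy sketch of the general concept: mapping external landmarks detected in a skin photo to an approximate probe start point for a target organ. Everything below — the landmark names, the organs, and the offset ratios — is invented for illustration and is not the paper's actual method, which learns a full semantic anatomy map rather than using fixed ratios.

```python
# Hypothetical illustration (NOT the paper's method): estimate a probe start
# point for an organ from external landmarks found in a skin photo.
# Landmark names, organ list, and offset ratios are invented for this sketch.

# Normalized (x, y) image coordinates of two external landmarks, as a
# vision model might output them.
LANDMARKS = {"suprasternal_notch": (0.50, 0.10), "navel": (0.50, 0.60)}

# Approximate organ entry points expressed as (lateral, axial) fractions of
# the notch-to-navel axis. Values are placeholders, not clinical data.
ORGAN_OFFSETS = {"liver": (0.12, 0.55), "bladder": (0.00, 1.10)}

def probe_start(organ, landmarks=LANDMARKS):
    """Return an approximate (x, y) probe start point in image coordinates."""
    x0, y0 = landmarks["suprasternal_notch"]
    x1, y1 = landmarks["navel"]
    lat, ax = ORGAN_OFFSETS[organ]
    dx, dy = x1 - x0, y1 - y0
    # Move `ax` of the way along the midline, then `lat` of the axis
    # length perpendicular to it.
    return (x0 + ax * dx + lat * dy, y0 + ax * dy - lat * dx)
```

A learned system replaces the hand-set ratios with a model that predicts patient-specific anatomy, but the input/output contract — external appearance in, internal target out — is the same.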

Original Paper

SAMe: A Semantic Anatomy Mapping Engine for Robotic Ultrasound

Jing Zhang, Duojie Chen, Wentao Jiang, Zihan Lou, Jianxin Liu, Xinwu Cui, Qinghong Zhao, Bo Du, Christoph F. Dietrich, Dacheng Tao

arXiv  ·  2604.25646

Robotic ultrasound has advanced local image-driven control, contact regulation, and view optimization, yet current systems lack the anatomical understanding needed to determine what to scan, where to begin, and how to adapt to individual patient anatomy. These gaps leave systems reliant on expert intervention to initiate scanning. Here we present SAMe, a semantic anatomy mapping engine that provides robotic ultrasound with an explicit anatomical prior layer. SAMe addresses scan initiation a