
Mixed Reality Interfaces for Achieving Desired Views with Robotic X-ray Systems.

Author Information

Killeen Benjamin D, Winter Jonas, Gu Wenhao, Martin-Gomez Alejandro, Taylor Russell H, Osgood Greg, Unberath Mathias

Affiliations

Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA.

Department of Orthopaedic Surgery, Johns Hopkins Hospital, Baltimore, MD, USA.

Publication Information

Comput Methods Biomech Biomed Eng Imaging Vis. 2023;11(4):1130-1135. doi: 10.1080/21681163.2022.2154272. Epub 2022 Dec 7.

Abstract

Robotic X-ray C-arm imaging systems can precisely achieve any position and orientation relative to the patient. Informing the system, however, what pose exactly corresponds to a desired view is challenging. Currently these systems are operated by the surgeon using joysticks, but this interaction paradigm is not necessarily effective because users may be unable to efficiently actuate more than a single axis of the system simultaneously. Moreover, novel robotic imaging systems, such as the Brainlab Loop-X, allow for independent source and detector movements, adding even more complexity. To address this challenge, we consider complementary interfaces for the surgeon to command robotic X-ray systems effectively. Specifically, we consider three interaction paradigms: (1) the use of a pointer to specify the principal ray of the desired view relative to the anatomy, (2) the same pointer, but combined with a mixed reality environment to synchronously render digitally reconstructed radiographs from the tool's pose, and (3) the same mixed reality environment but with a virtual X-ray source instead of the pointer. Initial human-in-the-loop evaluation with an attending trauma surgeon indicates that mixed reality interfaces for robotic X-ray system control are promising and may contribute to substantially reducing the number of X-ray images acquired solely during "fluoro hunting" for the desired view or standard plane.
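As an illustration of interaction paradigm (1), the sketch below shows how a tracked pointer pose might be translated into a desired C-arm view: the pointer axis defines the principal ray, and the source and detector are placed along that ray at fixed distances. This is a minimal sketch assuming the pointer tip and axis are expressed in the anatomy frame; the function name and distances are illustrative and not taken from the paper.

```python
import numpy as np

def view_from_pointer(tip, direction,
                      source_to_point=600.0,
                      source_to_detector=1200.0):
    """Hypothetical mapping from a pointer pose to a C-arm view.

    tip        : (3,) pointer-tip position in the anatomy frame [mm]
    direction  : (3,) pointer axis, taken as the desired principal ray
    Returns (source_position, detector_center), both (3,) arrays.
    """
    d = direction / np.linalg.norm(direction)      # unit principal ray
    source = tip - source_to_point * d             # source placed behind the anatomy
    detector = source + source_to_detector * d     # detector centered on the ray
    return source, detector

# Example: an AP-like view through a point at the anatomy-frame origin
# (numbers are purely illustrative).
src, det = view_from_pointer(np.array([0.0, 0.0, 0.0]),
                             np.array([0.0, 0.0, 1.0]))
```

In paradigm (2), the same pointer pose would additionally drive synchronous rendering of digitally reconstructed radiographs from a preoperative volume, giving the surgeon immediate feedback on the view before any X-ray is acquired.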


Similar Articles

Mixed Reality Interfaces for Achieving Desired Views with Robotic X-ray Systems.
Comput Methods Biomech Biomed Eng Imaging Vis. 2023;11(4):1130-1135. doi: 10.1080/21681163.2022.2154272. Epub 2022 Dec 7.

Take a shot! Natural language control of intelligent robotic X-ray systems in surgery.
Int J Comput Assist Radiol Surg. 2024 Jun;19(6):1165-1173. doi: 10.1007/s11548-024-03120-3. Epub 2024 Apr 15.

Interactive Flying Frustums (IFFs): spatially aware surgical data visualization.
Int J Comput Assist Radiol Surg. 2019 Jun;14(6):913-922. doi: 10.1007/s11548-019-01943-z. Epub 2019 Mar 12.

References

Fluoroscopic imaging: New advances.
Injury. 2022 Nov;53 Suppl 3:S8-S15. doi: 10.1016/j.injury.2022.05.035. Epub 2022 May 21.

Head-Mounted Display Use in Surgery: A Systematic Review.
Surg Innov. 2020 Feb;27(1):88-100. doi: 10.1177/1553350619871787. Epub 2019 Sep 12.
