Killeen Benjamin D, Winter Jonas, Gu Wenhao, Martin-Gomez Alejandro, Taylor Russell H, Osgood Greg, Unberath Mathias
Laboratory for Computational Sensing and Robotics, Johns Hopkins University, Baltimore, MD, USA.
Department of Orthopaedic Surgery, Johns Hopkins Hospital, Baltimore, MD, USA.
Comput Methods Biomech Biomed Eng Imaging Vis. 2023;11(4):1130-1135. doi: 10.1080/21681163.2022.2154272. Epub 2022 Dec 7.
Robotic X-ray C-arm imaging systems can precisely achieve any position and orientation relative to the patient. Informing the system, however, exactly which pose corresponds to a desired view is challenging. Currently, these systems are operated by the surgeon using joysticks, but this interaction paradigm is not necessarily effective because users may be unable to efficiently actuate more than one axis of the system at a time. Moreover, novel robotic imaging systems, such as the Brainlab Loop-X, allow for independent source and detector movements, adding even more complexity. To address this challenge, we consider complementary interfaces for the surgeon to command robotic X-ray systems effectively. Specifically, we consider three interaction paradigms: (1) the use of a pointer to specify the principal ray of the desired view relative to the anatomy, (2) the same pointer, but combined with a mixed reality environment to synchronously render digitally reconstructed radiographs from the tool's pose, and (3) the same mixed reality environment but with a virtual X-ray source instead of the pointer. Initial human-in-the-loop evaluation with an attending trauma surgeon indicates that mixed reality interfaces for robotic X-ray system control are promising and may contribute to substantially reducing the number of X-ray images acquired solely during "fluoro hunting" for the desired view or standard plane.
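To make the first interaction paradigm concrete, the sketch below (not the authors' implementation) illustrates one plausible way a tracked pointer pose could be mapped to a principal ray and a source/detector placement for a robotic C-arm. The frame convention, function name, and the source-to-isocenter and source-to-detector distances are all assumptions for illustration.

```python
# Minimal sketch, assuming the pointer tip marks the anatomical point of
# interest and its +z axis points along the desired principal ray.
# Distances and naming are hypothetical, not taken from the paper.
import numpy as np

def pointer_to_view(T_world_pointer: np.ndarray,
                    source_to_isocenter: float = 600.0,   # mm, assumed
                    source_to_detector: float = 1000.0):  # mm, assumed
    """Derive a desired X-ray view from a tracked pointer pose.

    T_world_pointer : 4x4 homogeneous transform of the pointer in world frame.
    Returns world-frame positions of the X-ray source and detector centre,
    plus the unit principal-ray direction.
    """
    tip = T_world_pointer[:3, 3]                       # pointer tip = ray target
    ray = T_world_pointer[:3, :3] @ np.array([0.0, 0.0, 1.0])
    ray /= np.linalg.norm(ray)                         # unit principal-ray direction

    source = tip - source_to_isocenter * ray           # source behind the anatomy
    detector_centre = source + source_to_detector * ray
    return source, detector_centre, ray

# Example: pointer aligned with the world z-axis, tip at the origin.
src, det, ray = pointer_to_view(np.eye(4))
print(src, det, ray)   # [0 0 -600], [0 0 400], [0 0 1]
```

The resulting source and detector positions would still need to be converted into the imaging system's joint or gantry parameters, which is vendor-specific and outside the scope of this sketch.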