Vörös Viktor, Li Ruixuan, Davoodi Ayoob, Wybaillie Gauthier, Vander Poorten Emmanuel, Niu Kenan
Robot-Assisted Surgery Group, Department of Mechanical Engineering, KU Leuven, Celestijnenlaan 300, 3000 Leuven, Belgium.
Healthcare Division, Barco NV, Beneluxpark 21, 8500 Kortrijk, Belgium.
J Imaging. 2022 Oct 6;8(10):273. doi: 10.3390/jimaging8100273.
Robot-assisted surgery is becoming popular in the operating room (OR), e.g., for orthopedic surgery. However, robotic execution of surgical steps cannot simply rely on preoperative plans. Using pedicle screw placement as an example, extra adjustments are needed to adapt to intraoperative changes when the preoperative plan becomes outdated. Adjusting a surgical plan during surgery is non-trivial and typically rather complex, since the interfaces available in current robotic systems are not always intuitive to use. Recently, thanks to technical advancements in head-mounted displays (HMDs), augmented reality (AR)-based medical applications are emerging in the OR. Rendered virtual objects can be overlaid on real-world physical objects to offer intuitive displays of the surgical site and anatomy. Moreover, the potential of combining AR with robotics is even more promising; however, it has not been fully exploited. In this paper, an innovative AR-based robotic approach is proposed and its technical feasibility is demonstrated in simulated pedicle screw placement. An approach for spatial calibration between the robot and the HoloLens 2 without using an external 3D tracking system is proposed. The developed system offers an intuitive AR-based interaction between the surgeon and the surgical robot: the current surgical plan is projected to the surgeon for fine-tuning, and the updated plan is immediately transferred back to the robot for execution. A series of bench-top experiments was conducted to evaluate system accuracy and human-related errors. A mean calibration error of 3.61 mm was found. The overall target pose error was 3.05 mm in translation and 1.12° in orientation. The average execution time for defining a target entry point intraoperatively was 26.56 s.
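The abstract does not detail the calibration math, but a common way to recover the rigid transform between a robot frame and an HMD frame from paired point measurements is least-squares point-set registration (Kabsch/SVD). The sketch below is illustrative only, not the authors' implementation; the point sets, the hypothetical `rigid_registration` helper, and the error metrics (mean residual as a proxy for "calibration error", rotation angle as a proxy for "orientation error") are assumptions for demonstration.

```python
import numpy as np

def rigid_registration(P, Q):
    """Estimate rotation R and translation t mapping P onto Q in the
    least-squares sense (Kabsch/SVD). P, Q: (N, 3) corresponding points,
    e.g. the same markers measured in the robot and HoloLens frames."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                  # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cq - R @ cp
    return R, t

# Synthetic check: recover a known transform (hypothetical data, not the paper's).
rng = np.random.default_rng(0)
P = rng.uniform(-50.0, 50.0, size=(8, 3))      # points in the "robot" frame (mm)
a = np.deg2rad(30.0)
R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
t_true = np.array([10.0, -5.0, 2.0])
Q = P @ R_true.T + t_true                      # same points in the "HMD" frame

R, t = rigid_registration(P, Q)
# Mean residual over the fiducials: one plausible "calibration error" metric.
residual = np.linalg.norm(P @ R.T + t - Q, axis=1).mean()
# Orientation error between two rotations, as an angle in degrees.
cos_theta = np.clip((np.trace(R.T @ R_true) - 1.0) / 2.0, -1.0, 1.0)
rot_err_deg = np.degrees(np.arccos(cos_theta))
```

With noise-free synthetic correspondences both residual and rotation error are numerically zero; with real measurements they would report figures analogous to the millimeter/degree errors quoted above.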
This work offers an intuitive AR-based robotic approach, which could facilitate robotic technology in the OR and boost synergy between AR and robots for other medical applications.