Liebmann Florentin, von Atzigen Marco, Stütz Dominik, Wolf Julian, Zingg Lukas, Suter Daniel, Cavalcanti Nicola A, Leoty Laura, Esfandiari Hooman, Snedeker Jess G, Oswald Martin R, Pollefeys Marc, Farshad Mazda, Fürnstahl Philipp
Research in Orthopedic Computer Science, Balgrist University Hospital, University of Zurich, Zurich, Switzerland; Laboratory for Orthopaedic Biomechanics, ETH Zurich, Zurich, Switzerland.
Med Image Anal. 2024 Jan;91:103027. doi: 10.1016/j.media.2023.103027. Epub 2023 Nov 10.
Established surgical navigation systems for pedicle screw placement have been proven to be accurate, but still reveal limitations in registration or surgical guidance. Registration of preoperative data to the intraoperative anatomy remains a time-consuming, error-prone task that includes exposure to harmful radiation. Surgical guidance through conventional displays has well-known drawbacks, as information cannot be presented in-situ and from the surgeon's perspective. Consequently, radiation-free and more automatic registration methods with subsequent surgeon-centric navigation feedback are desirable. In this work, we present a marker-less approach that automatically solves the registration problem for lumbar spinal fusion surgery in a radiation-free manner. A deep neural network was trained to segment the lumbar spine and simultaneously predict its orientation, yielding an initial pose for preoperative models, which is then refined for each vertebra individually and updated in real-time with GPU acceleration while handling surgeon occlusions. An intuitive surgical guidance is provided thanks to the integration into an augmented reality-based navigation system. The registration method was verified on a public dataset with a median of 100% successful registrations, a median target registration error of 2.7 mm, a median screw trajectory error of 1.6° and a median screw entry point error of 2.3 mm. Additionally, the whole pipeline was validated in an ex-vivo surgery, yielding a 100% screw accuracy and a median target registration error of 1.0 mm. Our results meet clinical demands and emphasize the potential of RGB-D data for fully automatic registration approaches in combination with augmented reality guidance.
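The per-vertebra pose refinement described in the abstract is commonly realized as a rigid point-cloud registration, e.g. iterative closest point (ICP) alternating nearest-neighbour matching with a closed-form Kabsch/SVD alignment. The paper does not publish its exact algorithm here, so the following is a minimal illustrative sketch of that generic refinement idea, not the authors' implementation; all function names are ours, and a production system would use an accelerated nearest-neighbour search and robust outlier handling to cope with occlusions.

```python
import numpy as np

def kabsch(src, dst):
    """Closed-form best-fit rotation R and translation t with src -> R @ src + t ~= dst."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)                  # cross-covariance of centered clouds
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def icp_refine(src, dst, iters=20):
    """Refine a rigid pose by alternating nearest-neighbour matching and Kabsch steps.

    src: (N, 3) model points already placed at an initial pose estimate.
    dst: (M, 3) observed target points (e.g. an RGB-D surface of one vertebra).
    Returns the accumulated (R, t) and the transformed source points.
    """
    cur = src.copy()
    R_tot, t_tot = np.eye(3), np.zeros(3)
    for _ in range(iters):
        # Brute-force nearest neighbours; fine for a sketch, replace with a
        # KD-tree or GPU search for real-time use.
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = kabsch(cur, matched)
        cur = cur @ R.T + t
        R_tot, t_tot = R @ R_tot, R @ t_tot + t    # compose incremental transforms
    return R_tot, t_tot, cur
```

Because ICP only converges from a nearby starting pose, a learned initialization (such as the orientation-predicting segmentation network described above) is what makes this refinement step practical.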