Computer Aided Medical Procedures, Johns Hopkins University, Baltimore, USA.
Department of Computer Science, Johns Hopkins University, Baltimore, USA.
Int J Comput Assist Radiol Surg. 2019 Sep;14(9):1553-1563. doi: 10.1007/s11548-019-02035-8. Epub 2019 Jul 26.
Image-guided percutaneous interventions are safer alternatives to conventional orthopedic and trauma surgery. To confidently advance surgical tools through complex bony structures during these procedures, a large number of X-ray images is acquired. While image guidance is the de facto standard for guaranteeing an acceptable outcome, these images are presented on monitors far from the surgical site, so their information content cannot easily be associated with the 3D patient anatomy.
In this article, we propose a collaborative augmented reality (AR) surgical ecosystem that co-localizes the C-arm X-ray source and the surgeon's viewer. The technical contributions of this work are (1) joint calibration of a visual tracker mounted on a C-arm scanner and the scanner's X-ray source via a hand-eye calibration strategy, and (2) inside-out co-localization of the human and X-ray observers in a shared tracking and augmentation environment using vision-based simultaneous localization and mapping (SLAM).
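Contribution (1) can be cast as the classical hand-eye calibration problem. As a sketch of the standard formulation (the paper's specific solver may differ): let \(A_i\) denote the relative motion of the X-ray source between two C-arm poses and \(B_i\) the corresponding relative motion of the visual tracker; the unknown rigid transform \(X\) from tracker to X-ray source then satisfies, for every pose pair \(i\),

```latex
A_i X = X B_i,
\qquad
A_i = \begin{pmatrix} R_{A_i} & t_{A_i} \\ \mathbf{0}^\top & 1 \end{pmatrix},
\quad
B_i = \begin{pmatrix} R_{B_i} & t_{B_i} \\ \mathbf{0}^\top & 1 \end{pmatrix},
```

which decouples into a rotational constraint \(R_{A_i} R_X = R_X R_{B_i}\) and a translational constraint \((R_{A_i} - I)\,t_X = R_X t_{B_i} - t_{A_i}\). Stacking these constraints over many pose pairs (on the order of the 50 pairs reported below) yields an overdetermined system that is typically solved in a least-squares sense, e.g., with the Tsai-Lenz method.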
We present a thorough evaluation of the hand-eye calibration procedure. Results suggest convergence when 50 or more pose pairs are used. The mean translation and rotation errors at convergence are 5.7 mm and [Formula: see text], respectively. Further, user-in-the-loop studies were conducted to estimate the end-to-end target augmentation error. The mean distance between landmarks in the real and virtual environments was 10.8 mm.
The proposed AR solution provides a shared augmented experience between the human and X-ray viewers. The collaborative surgical AR system has the potential to simplify hand-eye coordination for surgeons and to intuitively inform C-arm technologists during prospective X-ray viewpoint planning.