Sheikh Zayed Institute for Pediatric Surgical Innovation, Children's National Hospital, Washington, DC, USA.
IGI Technologies, Inc., College Park, MD, USA.
Int J Comput Assist Radiol Surg. 2020 May;15(5):803-810. doi: 10.1007/s11548-020-02164-5. Epub 2020 Apr 22.
For laparoscopic ablation to be successful, accurate placement of the needle in the tumor is essential. Laparoscopic ultrasound is a key tool for guiding needle placement, but the ultrasound image is generally presented separately from the laparoscopic image. We aim to evaluate an augmented reality (AR) system that combines the laparoscopic ultrasound image, the laparoscope video, and the needle trajectory in a single unified view.
We created a tissue phantom made of gelatin. Artificial tumors, represented by plastic spheres, were secured in the gelatin at various depths. The topmost point of each sphere's surface served as the target, and its 3D coordinates were known. Participants were invited to perform needle placement with and without AR guidance. Once the participant reported that the needle tip had reached the target, the needle tip location was recorded and compared to the ground-truth location of the target; the distance between the two was the target localization error (TLE). Needle placement time was also recorded. We further tested the technical feasibility of the AR system in vivo on a 40-kg swine.
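As defined above, the TLE reduces to the Euclidean distance between two 3D points: the reported needle tip and the known target. A minimal sketch of that computation (the function name and the coordinates are illustrative, not taken from the paper):

```python
import math

def target_localization_error(needle_tip, target):
    """Euclidean distance between the reported needle tip and the
    ground-truth target, both given as (x, y, z) in millimetres."""
    return math.dist(needle_tip, target)

# Hypothetical coordinates for illustration only.
tip = (10.0, 22.0, 35.0)
target = (12.0, 20.0, 34.0)
print(round(target_localization_error(tip, target), 1))  # 3.0 mm
```

In practice the ground-truth target coordinates would come from the phantom construction and the needle tip position from the tracking system, both expressed in the same coordinate frame.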
The AR guidance system was evaluated by two experienced surgeons and two surgical fellows. The users performed needle placement on a total of 26 targets, 13 with AR guidance and 13 without (i.e., the conventional approach). The average TLE was 14.9 mm for the conventional approach and 11.1 mm for the AR approach. The average needle placement time was 59.4 s for the conventional approach and 22.9 s for the AR approach. In the animal study, the ultrasound image and needle trajectory were successfully fused with the laparoscopic video in real time and presented to the surgeons on a single screen.
By providing the projected needle trajectory, we believe our AR system can help the surgeon place the needle more efficiently and precisely.