Robotic Endoscope Control Via Autonomous Instrument Tracking.

Authors

Gruijthuijsen Caspar, Garcia-Peraza-Herrera Luis C, Borghesan Gianni, Reynaerts Dominiek, Deprest Jan, Ourselin Sebastien, Vercauteren Tom, Vander Poorten Emmanuel

Affiliations

Department of Mechanical Engineering, KU Leuven, Leuven, Belgium.

Department of Medical Physics and Biomedical Engineering, University College London, London, United Kingdom.

Publication

Front Robot AI. 2022 Apr 11;9:832208. doi: 10.3389/frobt.2022.832208. eCollection 2022.

DOI: 10.3389/frobt.2022.832208
PMID: 35480090
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9035496/
Abstract

Many keyhole interventions rely on bi-manual handling of surgical instruments, forcing the main surgeon to rely on a second surgeon to act as a camera assistant. In addition to the burden of excessively involving surgical staff, this may lead to reduced image stability, increased task completion time and sometimes errors due to the monotony of the task. Robotic endoscope holders, controlled by a set of basic instructions, have been proposed as an alternative, but their unnatural handling may increase the cognitive load of the (solo) surgeon, which hinders their clinical acceptance. More seamless integration in the surgical workflow would be achieved if robotic endoscope holders collaborated with the operating surgeon via semantically rich instructions that closely resemble instructions that would otherwise be issued to a human camera assistant, such as "focus on my right-hand instrument." As a proof of concept, this paper presents a novel system that paves the way towards a synergistic interaction between surgeons and robotic endoscope holders. The proposed platform allows the surgeon to perform a bimanual coordination and navigation task, while a robotic arm autonomously performs the endoscope positioning tasks. Within our system, we propose a novel tooltip localization method based on surgical tool segmentation and a novel visual servoing approach that ensures smooth and appropriate motion of the endoscope camera. We validate our vision pipeline and run a user study of this system. The clinical relevance of the study is ensured through the use of a laparoscopic exercise validated by the European Academy of Gynaecological Surgery which involves bi-manual coordination and navigation. Successful application of our proposed system provides a promising starting point towards broader clinical adoption of robotic endoscope holders.
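The abstract mentions a visual servoing approach that keeps the tracked tool tip appropriately framed by the endoscope camera. The paper's specific control law is not reproduced on this page; as a generic illustration of the idea, the sketch below implements one step of textbook image-based visual servoing (IBVS) for a single point feature, restricted to translational camera motion. All names, gains, and parameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ibvs_camera_velocity(tooltip_px, image_wh, focal_px, depth_m, gain=0.5):
    """One step of image-based visual servoing that drives a tracked tool
    tip toward the image center, using translational camera motion only.

    tooltip_px: (u, v) pixel position of the detected tool tip
    image_wh:   (width, height) of the endoscope image in pixels
    focal_px:   focal length in pixels (pinhole camera model)
    depth_m:    estimated feature depth Z along the optical axis, in meters
    Returns a camera-frame velocity (vx, vy, vz).
    """
    # Normalized image-plane coordinates of the feature.
    x = (tooltip_px[0] - image_wh[0] / 2.0) / focal_px
    y = (tooltip_px[1] - image_wh[1] / 2.0) / focal_px
    e = np.array([x, y])  # error w.r.t. the desired position (image center)

    Z = depth_m
    # Interaction matrix of a point feature, translational columns only:
    # relates camera velocity to the feature's image-plane velocity.
    L = np.array([[-1.0 / Z, 0.0, x / Z],
                  [0.0, -1.0 / Z, y / Z]])

    # Classic proportional IBVS law: v = -gain * pinv(L) @ e, so the
    # closed-loop error decays exponentially toward zero.
    return -gain * np.linalg.pinv(L) @ e
```

With the tool tip already centered the commanded velocity is zero; an off-center tip produces a corrective camera translation. A real system would add rotational degrees of freedom, a remote-center-of-motion constraint at the trocar, and smoothing of the segmentation-derived tip estimate.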


Figures:
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/1ce25e0e2bf2/frobt-09-832208-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/5bc0c3249614/frobt-09-832208-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/6b361a857f85/frobt-09-832208-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/cd2709c24d7e/frobt-09-832208-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/14fee04c10e4/frobt-09-832208-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/3bdfe108f5af/frobt-09-832208-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/15df832447fe/frobt-09-832208-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/e1cf40da34f6/frobt-09-832208-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/3f55f40c4f0f/frobt-09-832208-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/77f2c12b2238/frobt-09-832208-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/50a674a0d43e/frobt-09-832208-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/5b5b6e9fc030/frobt-09-832208-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/91629f13d40c/frobt-09-832208-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/2ae8281b130e/frobt-09-832208-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/6220bb750109/frobt-09-832208-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6f8f/9035496/8489ef813125/frobt-09-832208-g016.jpg

Similar articles

1. Robotic Endoscope Control Via Autonomous Instrument Tracking.
Front Robot AI. 2022 Apr 11;9:832208. doi: 10.3389/frobt.2022.832208. eCollection 2022.
2. A hands-free region-of-interest selection interface for solo surgery with a wide-angle endoscope: preclinical proof of concept.
Surg Endosc. 2017 Feb;31(2):974-980. doi: 10.1007/s00464-016-5013-8. Epub 2016 Aug 8.
3. Robotic arm enhancement to accommodate improved efficiency and decreased resource utilization in complex minimally invasive surgical procedures.
Stud Health Technol Inform. 1996;29:471-81.
4. Robotic-assisted minimally invasive surgery for gynecologic and urologic oncology: an evidence-based analysis.
Ont Health Technol Assess Ser. 2010;10(27):1-118. Epub 2010 Dec 1.
5. Camera and instrument holders and their clinical value in minimally invasive surgery.
Surg Laparosc Endosc Percutan Tech. 2004 Jun;14(3):145-52. doi: 10.1097/01.sle.0000129395.42501.5d.
6. Automating Endoscope Motion in Robotic Surgery: A Usability Study on da Vinci-Assisted Neobladder Reconstruction.
Front Robot AI. 2021 Nov 25;8:707704. doi: 10.3389/frobt.2021.707704. eCollection 2021.
7. Experimental trial on solo surgery for minimally invasive therapy: comparison of different systems in a phantom model.
Surg Endosc. 2000 Oct;14(10):955-9. doi: 10.1007/s004640000106.
8. Homography-based Visual Servoing with Remote Center of Motion for Semi-autonomous Robotic Endoscope Manipulation.
Int Symp Med Robot. 2021 Nov 17;220:1-7. doi: 10.1109/ISMR48346.2021.9661563.
9. Gaze gesture based human robot interaction for laparoscopic surgery.
Med Image Anal. 2018 Feb;44:196-214. doi: 10.1016/j.media.2017.11.011. Epub 2017 Nov 28.
10. Robotic Endoscope Control - State of the Art of Voice Control and Other Options for Laparoscopic Camera Robot Guidance.
Surg Technol Int. 2022 May 19;40:17-24. doi: 10.52198/22.STI.40.SO1545.

Cited by

1. Rapid and robust endoscopic content area estimation: A lean GPU-based pipeline and curated benchmark dataset.
Comput Methods Biomech Biomed Eng Imaging Vis. 2023 Jul 4;11(4):1215-1224. doi: 10.1080/21681163.2022.2156393. Epub 2023 Jan 4.
2. Endoscope Automation Framework with Hierarchical Control and Interactive Perception for Multi-Tool Tracking in Minimally Invasive Surgery.
Sensors (Basel). 2023 Dec 16;23(24):9865. doi: 10.3390/s23249865.
3. Artificial intelligence and automation in endoscopy and surgery.
Nat Rev Gastroenterol Hepatol. 2023 Mar;20(3):171-182. doi: 10.1038/s41575-022-00701-y. Epub 2022 Nov 9.
References

1. Image-based laparoscopic camera steering versus conventional steering: a comparison study.
J Robot Surg. 2022 Oct;16(5):1157-1163. doi: 10.1007/s11701-021-01342-0. Epub 2022 Jan 21.
2. Automating Endoscope Motion in Robotic Surgery: A Usability Study on da Vinci-Assisted Neobladder Reconstruction.
Front Robot AI. 2021 Nov 25;8:707704. doi: 10.3389/frobt.2021.707704. eCollection 2021.
3. A learning robot for cognitive camera control in minimally invasive surgery.
Surg Endosc. 2021 Sep;35(9):5365-5374. doi: 10.1007/s00464-021-08509-8. Epub 2021 Apr 27.
4. Comparative validation of multi-instance instrument segmentation in endoscopy: Results of the ROBUST-MIS 2019 challenge.
Med Image Anal. 2021 May;70:101920. doi: 10.1016/j.media.2020.101920. Epub 2020 Nov 28.
5. Image Compositing for Segmentation of Surgical Tools Without Manual Annotations.
IEEE Trans Med Imaging. 2021 May;40(5):1450-1460. doi: 10.1109/TMI.2021.3057884. Epub 2021 Apr 30.
6. Medical robotics - Regulatory, ethical, and legal considerations for increasing levels of autonomy.
Sci Robot. 2017 Mar 15;2(4). doi: 10.1126/scirobotics.aam8638.
7. Evaluation of a remote-controlled laparoscopic camera holder for basic laparoscopic skills acquisition: a randomized controlled trial.
Surg Endosc. 2021 Aug;35(8):4183-4191. doi: 10.1007/s00464-020-07899-5. Epub 2020 Aug 26.
8. Ergonomics Analysis for Subjective and Objective Fatigue Between Laparoscopic and Robotic Surgical Skills Practice Among Surgeons.
Surg Innov. 2020 Feb;27(1):81-87. doi: 10.1177/1553350619887861. Epub 2019 Nov 27.
9. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy.
Med Image Anal. 2020 Feb;60:101588. doi: 10.1016/j.media.2019.101588. Epub 2019 Oct 29.
10. Dense Depth Estimation in Monocular Endoscopy With Self-Supervised Learning Methods.
IEEE Trans Med Imaging. 2020 May;39(5):1438-1447. doi: 10.1109/TMI.2019.2950936. Epub 2019 Nov 1.