Suppr 超能文献



Monocular Robust Depth Estimation Vision System for Robotic Tasks Interventions in Metallic Targets.

Authors

Veiga Almagro Carlos, Di Castro Mario, Lunghi Giacomo, Marín Prades Raúl, Sanz Valero Pedro José, Pérez Manuel Ferre, Masi Alessandro

Affiliations

CERN, EN-SMM Survey, Measurement and Mechatronics group, 1217 Geneva, Switzerland.

Centro de Automatica y Robotica (CAR) UPM-CSIC, Universidad Politecnica de Madrid, 28006 Madrid, Spain.

Publication

Sensors (Basel). 2019 Jul 22;19(14):3220. doi: 10.3390/s19143220.

DOI: 10.3390/s19143220
PMID: 31336628
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6679509/
Abstract

Robotic interventions in hazardous scenarios need to pay special attention to safety, as in most cases it is necessary to have an expert operator in the loop. Moreover, the use of a multi-modal Human-Robot Interface allows the user to interact with the robot using manual control in critical steps, as well as semi-autonomous behaviours in more secure scenarios, by using, for example, object tracking and recognition techniques. This paper describes a novel vision system to track and estimate the depth of metallic targets for robotic interventions. The system has been designed for on-hand monocular cameras, focusing on solving lack of visibility and partial occlusions. This solution has been validated during real interventions at the Centre for Nuclear Research (CERN) accelerator facilities, achieving 95% success in autonomous mode and 100% in a supervised manner. The system increases the safety and efficiency of the robotic operations, reducing the cognitive fatigue of the operator during non-critical mission phases. The integration of such an assistance system is especially important when facing complex (or repetitive) tasks, in order to reduce the work load and accumulated stress of the operator, enhancing the performance and safety of the mission.
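As an illustrative aside (this is not code from the paper, whose occlusion-robust pipeline is more involved): the basic geometry behind monocular depth estimation of a target of known physical size is the pinhole model, Z = f * W / w, where f is the focal length in pixels, W the real target width, and w the target's apparent width in the image. A minimal sketch, assuming a calibrated camera and a detected bounding box:

```python
def depth_from_size(focal_px: float, real_width_m: float, pixel_width: float) -> float:
    """Pinhole-model depth estimate: Z = f * W / w.

    focal_px     -- focal length in pixels (from camera calibration)
    real_width_m -- known physical width of the target, in metres
    pixel_width  -- width of the target's bounding box in the image, in pixels
    """
    if pixel_width <= 0:
        raise ValueError("pixel width must be positive")
    return focal_px * real_width_m / pixel_width

# Example: 800 px focal length, 10 cm target appearing 40 px wide.
z = depth_from_size(800.0, 0.10, 40.0)
print(round(z, 3))  # 2.0 metres
```

This only shows the core single-view geometry; the system described in the abstract additionally has to cope with poor visibility and partial occlusion of metallic targets, which this sketch does not address.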


Figures: the PMC record includes 22 figures (sensors-19-03220-g001 through sensors-19-03220-g022), viewable via the full-text link above.

Similar Articles

1. Monocular Robust Depth Estimation Vision System for Robotic Tasks Interventions in Metallic Targets.
Sensors (Basel). 2019 Jul 22;19(14):3220. doi: 10.3390/s19143220.
2. Performance evaluation of 3D vision-based semi-autonomous control method for assistive robotic manipulator.
Disabil Rehabil Assist Technol. 2018 Feb;13(2):140-145. doi: 10.1080/17483107.2017.1299804. Epub 2017 Mar 22.
3. (MARGOT) Monocular Camera-Based Robot Grasping Strategy for Metallic Objects.
Sensors (Basel). 2023 Jun 5;23(11):5344. doi: 10.3390/s23115344.
4. Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping.
J Neuroeng Rehabil. 2016 Mar 18;13:28. doi: 10.1186/s12984-016-0134-9.
5. A Confidence-Based Shared Control Strategy for the Smart Tissue Autonomous Robot (STAR).
Rep U S. 2018 Oct;2018:1268-1275. doi: 10.1109/IROS.2018.8594290. Epub 2019 Jan 7.
6. Design and Operational Elements of the Robotic Subsystem for the e.deorbit Debris Removal Mission.
Front Robot AI. 2018 Aug 31;5:100. doi: 10.3389/frobt.2018.00100. eCollection 2018.
7. Usability testing of a developed assistive robotic system with virtual assistance for individuals with cerebral palsy: a case study.
Disabil Rehabil Assist Technol. 2018 Aug;13(6):517-522. doi: 10.1080/17483107.2017.1344884. Epub 2017 Jul 4.
8. A Confidence-Based Supervised-Autonomous Control Strategy for Robotic Vaginal Cuff Closure.
IEEE Int Conf Robot Autom. 2021 May-Jun;2021. doi: 10.1109/icra48506.2021.9561685. Epub 2021 Oct 18.
9. 3D Visual Tracking of an Articulated Robot in Precision Automated Tasks.
Sensors (Basel). 2017 Jan 7;17(1):104. doi: 10.3390/s17010104.
10. Real-Time Fruit Recognition and Grasping Estimation for Robotic Apple Harvesting.
Sensors (Basel). 2020 Oct 4;20(19):5670. doi: 10.3390/s20195670.

Cited By

1. (MARGOT) Monocular Camera-Based Robot Grasping Strategy for Metallic Objects.
Sensors (Basel). 2023 Jun 5;23(11):5344. doi: 10.3390/s23115344.
2. Manipulation Tasks in Hazardous Environments Using a Teleoperated Robot: A Case Study at CERN.
Sensors (Basel). 2023 Feb 10;23(4):1979. doi: 10.3390/s23041979.
3. MiniCERNBot Educational Platform: Antimatter Factory Mock-up Missions for Problem-Solving STEM Learning.

References

1. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks.
IEEE Trans Pattern Anal Mach Intell. 2017 Jun;39(6):1137-1149. doi: 10.1109/TPAMI.2016.2577031. Epub 2016 Jun 6.
2. Struck: Structured Output Tracking with Kernels.
IEEE Trans Pattern Anal Mach Intell. 2016 Oct;38(10):2096-109. doi: 10.1109/TPAMI.2015.2509974. Epub 2015 Dec 17.
3. High-Speed Tracking with Kernelized Correlation Filters.
IEEE Trans Pattern Anal Mach Intell. 2015 Mar;37(3):583-96. doi: 10.1109/TPAMI.2014.2345390.
4. Multi-Scale Spatio-Temporal Feature Extraction and Depth Estimation from Sequences by Ordinal Classification.
Sensors (Basel). 2020 Apr 1;20(7):1979. doi: 10.3390/s20071979.
5. Robust Object Tracking with Online Multiple Instance Learning.
IEEE Trans Pattern Anal Mach Intell. 2011 Aug;33(8):1619-32. doi: 10.1109/TPAMI.2010.226. Epub 2010 Dec 23.