


Vision for Robust Robot Manipulation.

Affiliations

RoViT, University of Alicante, 03690 San Vicente del Raspeig (Alicante), Spain.

RobInLab, Jaume I University, 12071 Castello de la Plana, Spain.

Publication Info

Sensors (Basel). 2019 Apr 6;19(7):1648. doi: 10.3390/s19071648.

DOI: 10.3390/s19071648
PMID: 30959920
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6480289/
Abstract

Advances in Robotics are leading to a new generation of assistant robots working in ordinary, domestic settings. This evolution raises new challenges in the tasks to be accomplished by the robots. This is the case for object manipulation where the detect-approach-grasp loop requires a robust recovery stage, especially when the held object slides. Several proprioceptive sensors have been developed in the last decades, such as tactile sensors or contact switches, that can be used for that purpose; nevertheless, their implementation may considerably restrict the gripper's flexibility and functionality, increasing their cost and complexity. Alternatively, vision can be used since it is an undoubtedly rich source of information, and in particular, depth vision sensors. We present an approach based on depth cameras to robustly evaluate the manipulation success, continuously reporting about any object loss and, consequently, allowing it to robustly recover from this situation. For that, a Lab-colour segmentation allows the robot to identify potential robot manipulators in the image. Then, the depth information is used to detect any edge resulting from two-object contact. The combination of those techniques allows the robot to accurately detect the presence or absence of contact points between the robot manipulator and a held object. An experimental evaluation in realistic indoor environments supports our approach.
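The two-stage idea in the abstract (segment the manipulator in colour space, then use depth edges to confirm contact with the held object) can be sketched in a few lines. The following is an illustrative NumPy sketch under stated assumptions, not the authors' implementation: the Lab-colour segmentation step is replaced by precomputed binary masks (in practice one would convert the image to Lab space, e.g. with OpenCV, and threshold the colour channels), and all function names, the synthetic depth map, and the gradient threshold are assumptions made for the example.

```python
import numpy as np

def depth_contact_edges(depth, grad_thresh=0.05):
    """Mark pixels where depth changes sharply (candidate contact edges)."""
    gy, gx = np.gradient(depth.astype(float))
    return np.hypot(gx, gy) > grad_thresh

def contact_points(gripper_mask, object_mask, edge_mask, dilate=1):
    """Contact = depth edges adjacent to both the gripper and object regions."""
    def grow(mask, radius):
        # Simple 4-neighbour dilation, so "adjacent" tolerates a few pixels.
        out = mask.copy()
        for _ in range(radius):
            p = np.pad(out, 1)
            out = (p[1:-1, 1:-1] | p[:-2, 1:-1] | p[2:, 1:-1]
                   | p[1:-1, :-2] | p[1:-1, 2:])
        return out
    return edge_mask & grow(gripper_mask, dilate) & grow(object_mask, dilate)

# Synthetic scene: flat background (depth 1.0), gripper and held object closer.
depth = np.ones((32, 32))
depth[8:24, 6:14] = 0.5    # gripper fingers
depth[8:24, 14:22] = 0.5   # held object, touching the gripper
gripper = np.zeros((32, 32), bool); gripper[8:24, 6:14] = True
obj = np.zeros((32, 32), bool); obj[8:24, 14:22] = True

edges = depth_contact_edges(depth)
contacts = contact_points(gripper, obj, edges)
held = contacts.any()  # any contact point => the object is still held
```

If the object slips away, the gripper and object masks are no longer adjacent to a shared depth edge, `contacts` comes back empty, and the robot can trigger the recovery stage described in the abstract.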


Similar Articles

1
Vision for Robust Robot Manipulation.
Sensors (Basel). 2019 Apr 6;19(7):1648. doi: 10.3390/s19071648.
2
Towards Haptic-Based Dual-Arm Manipulation.
Sensors (Basel). 2022 Dec 29;23(1):376. doi: 10.3390/s23010376.
3
Unknown Object Detection Using a One-Class Support Vector Machine for a Cloud-Robot System.
Sensors (Basel). 2022 Feb 10;22(4):1352. doi: 10.3390/s22041352.
4
Learning-based control approaches for service robots on cloth manipulation and dressing assistance: a comprehensive review.
J Neuroeng Rehabil. 2022 Nov 3;19(1):117. doi: 10.1186/s12984-022-01078-4.
5
Robot Intelligent Grasp of Unknown Objects Based on Multi-Sensor Information.
Sensors (Basel). 2019 Apr 2;19(7):1595. doi: 10.3390/s19071595.
6
A multi-sensorial hybrid control for robotic manipulation in human-robot workspaces.
Sensors (Basel). 2011;11(10):9839-62. doi: 10.3390/s111009839. Epub 2011 Oct 20.
7
The Synthetic Moth: A Neuromorphic Approach toward Artificial Olfaction in Robots.
8
Open core control software for surgical robots.
Int J Comput Assist Radiol Surg. 2010 May;5(3):211-20. doi: 10.1007/s11548-009-0388-9. Epub 2009 Jul 28.
9
Performance evaluation of 3D vision-based semi-autonomous control method for assistive robotic manipulator.
Disabil Rehabil Assist Technol. 2018 Feb;13(2):140-145. doi: 10.1080/17483107.2017.1299804. Epub 2017 Mar 22.
10
Depth-Dependent Control in Vision-Sensor Space for Reconfigurable Parallel Manipulators.
Sensors (Basel). 2023 Aug 9;23(16):7039. doi: 10.3390/s23167039.

Cited By

1
Artificial Vision Algorithms for Socially Assistive Robot Applications: A Review of the Literature.
Sensors (Basel). 2021 Aug 25;21(17):5728. doi: 10.3390/s21175728.
2
Feature Sensing and Robotic Grasping of Objects with Uncertain Information: A Review.
Sensors (Basel). 2020 Jul 2;20(13):3707. doi: 10.3390/s20133707.
3
Special Issue on Visual Sensors.

References

1
Learning ambidextrous robot grasping policies.
Sci Robot. 2019 Jan 16;4(26). doi: 10.1126/scirobotics.aau4984.
2
PHAROS-PHysical Assistant RObot System.
Sensors (Basel). 2018 Aug 11;18(8):2633. doi: 10.3390/s18082633.
3
Hand-Object Contact Force Estimation from Markerless Visual Tracking.
Sensors (Basel). 2020 Feb 8;20(3):910. doi: 10.3390/s20030910.
4
Potential Energy Distribution of Redundant Cable-Driven Robot Applied to Compliant Grippers: Method and Computational Analysis.
Sensors (Basel). 2019 Aug 2;19(15):3403. doi: 10.3390/s19153403.
5
IEEE Trans Pattern Anal Mach Intell. 2018 Dec;40(12):2883-2896. doi: 10.1109/TPAMI.2017.2759736. Epub 2017 Oct 26.
6
Deep Learning for Computer Vision: A Brief Review.
Comput Intell Neurosci. 2018 Feb 1;2018:7068349. doi: 10.1155/2018/7068349. eCollection 2018.