

Robot Intelligent Grasp of Unknown Objects Based on Multi-Sensor Information

Authors

Ji Shan-Qian, Huang Ming-Bao, Huang Han-Pang

Affiliation

Robotics Laboratory, Department of Mechanical Engineering, National Taiwan University, Taipei 10617, Taiwan.

Publication

Sensors (Basel). 2019 Apr 2;19(7):1595. doi: 10.3390/s19071595.

DOI: 10.3390/s19071595
PMID: 30986985
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC6480045/
Abstract

Robots frequently need to work in human environments and handle many different types of objects. There are two problems that make this challenging for robots: human environments are typically cluttered, and the multi-finger robot hand needs to grasp and to lift objects without knowing their mass and damping properties. Therefore, this study combined vision and robot hand real-time grasp control action to achieve reliable and accurate object grasping in a cluttered scene. An efficient online algorithm for collision-free grasping pose generation according to a bounding box is proposed, and the grasp pose will be further checked for grasp quality. Finally, by fusing all available sensor data appropriately, an intelligent real-time grasp system was achieved that is reliable enough to handle various objects with unknown weights, friction, and stiffness. The robots used in this paper are the NTU 21-DOF five-finger robot hand and the NTU 6-DOF robot arm, which are both constructed by our Lab.
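The pipeline the abstract describes (generate grasp poses from a bounding box, filter out approaches that would collide with clutter, keep the rest for a quality check) can be sketched roughly as below. This is a hypothetical illustration only, not the paper's actual algorithm: the `Box` type, the side-grasp candidate rule, the sampled straight-line collision test, and all scene dimensions are assumptions made for the example.

```python
from dataclasses import dataclass
import math

@dataclass
class Box:
    cx: float; cy: float; cz: float   # centre of the axis-aligned bounding box
    dx: float; dy: float; dz: float   # full edge lengths

def point_box_distance(p, b):
    """Euclidean distance from point p to the closest point of box b."""
    d = 0.0
    for pc, c, e in ((p[0], b.cx, b.dx), (p[1], b.cy, b.dy), (p[2], b.cz, b.dz)):
        lo, hi = c - e / 2, c + e / 2
        if pc < lo:
            d += (lo - pc) ** 2
        elif pc > hi:
            d += (pc - hi) ** 2
    return math.sqrt(d)

def candidate_grasps(obj, max_opening):
    """Side grasps on the four vertical faces whose width fits the hand opening.
    Returns (contact_point, approach_direction, required_opening) tuples."""
    faces = [((-1, 0, 0), obj.dx), ((1, 0, 0), obj.dx),
             ((0, -1, 0), obj.dy), ((0, 1, 0), obj.dy)]
    out = []
    for (nx, ny, nz), width in faces:
        if width <= max_opening:
            contact = (obj.cx + nx * obj.dx / 2,
                       obj.cy + ny * obj.dy / 2,
                       obj.cz + nz * obj.dz / 2)
            out.append((contact, (-nx, -ny, -nz), width))
    return out

def collision_free(contact, approach, obstacles,
                   standoff=0.10, clearance=0.02, samples=10):
    """Sample the straight approach line (from `standoff` metres out, down to
    the contact point); reject it if any sample comes within `clearance` of
    an obstacle box."""
    for i in range(samples + 1):
        t = standoff * (1 - i / samples)
        p = tuple(c - a * t for c, a in zip(contact, approach))
        if any(point_box_distance(p, ob) < clearance for ob in obstacles):
            return False
    return True

# Toy cluttered scene: a small target object between two taller boxes, so
# only the unobstructed y-axis approaches should survive the filter.
target = Box(0.0, 0.0, 0.05, 0.08, 0.08, 0.10)
obstacles = [Box(0.15, 0.0, 0.10, 0.06, 0.30, 0.20),    # clutter on +x side
             Box(-0.15, 0.0, 0.10, 0.06, 0.30, 0.20)]   # clutter on -x side
feasible = [g for g in candidate_grasps(target, max_opening=0.12)
            if collision_free(g[0], g[1], obstacles)]
for contact, approach, width in feasible:
    print(approach, round(width, 3))
```

In the paper itself, the surviving poses would then be scored for grasp quality and executed with real-time finger force control; here the example stops at the geometric filtering stage.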


[Article figures sensors-19-01595-g001 through g035 are available with the full text at https://pmc.ncbi.nlm.nih.gov/articles/PMC6480045/]

Similar Articles

1. Robot Intelligent Grasp of Unknown Objects Based on Multi-Sensor Information.
Sensors (Basel). 2019 Apr 2;19(7):1595. doi: 10.3390/s19071595.
2. 3D Visual Data-Driven Spatiotemporal Deformations for Non-Rigid Object Grasping Using Robot Hands.
Sensors (Basel). 2016 May 5;16(5):640. doi: 10.3390/s16050640.
3. Learning the signatures of the human grasp using a scalable tactile glove.
Nature. 2019 May;569(7758):698-702. doi: 10.1038/s41586-019-1234-z. Epub 2019 May 29.
4. Keypoint-Based Robotic Grasp Detection Scheme in Multi-Object Scenes.
Sensors (Basel). 2021 Mar 18;21(6):2132. doi: 10.3390/s21062132.
5. Research on Intelligent Robot Point Cloud Grasping in Internet of Things.
Micromachines (Basel). 2022 Nov 17;13(11):1999. doi: 10.3390/mi13111999.
6. Monocular-Based 6-Degree of Freedom Pose Estimation Technology for Robotic Intelligent Grasping Systems.
Sensors (Basel). 2017 Feb 14;17(2):334. doi: 10.3390/s17020334.
7. Design and Experimental Research of Robot Finger Sliding Tactile Sensor Based on FBG.
Sensors (Basel). 2022 Nov 1;22(21):8390. doi: 10.3390/s22218390.
8. A Comprehensive Study of 3-D Vision-Based Robot Manipulation.
IEEE Trans Cybern. 2023 Mar;53(3):1682-1698. doi: 10.1109/TCYB.2021.3108165. Epub 2023 Feb 15.
9. Human Grasp Mechanism Understanding, Human-Inspired Grasp Control and Robotic Grasping Planning for Agricultural Robots.
Sensors (Basel). 2022 Jul 13;22(14):5240. doi: 10.3390/s22145240.
10. Grasp Stability Prediction for a Dexterous Robotic Hand Combining Depth Vision and Haptic Bayesian Exploration.
Front Robot AI. 2021 Aug 12;8:703869. doi: 10.3389/frobt.2021.703869. eCollection 2021.

Cited By

1. Eye-in-Hand Robotic Arm Gripping System Based on Machine Learning and State Delay Optimization.
Sensors (Basel). 2023 Jan 17;23(3):1076. doi: 10.3390/s23031076.
2. Object Manipulation with an Anthropomorphic Robotic Hand via Deep Reinforcement Learning with a Synergy Space of Natural Hand Poses.
Sensors (Basel). 2021 Aug 5;21(16):5301. doi: 10.3390/s21165301.
3. Tactile Sensors for Robotic Applications.

References

1. Learn the Lagrangian: A Vector-Valued RKHS Approach to Identifying Lagrangian Systems.
IEEE Trans Cybern. 2016 Dec;46(12):3247-3258. doi: 10.1109/TCYB.2015.2501842. Epub 2015 Dec 8.
2. Learning the Inverse Dynamics of Robotic Manipulators in Structured Reproducing Kernel Hilbert Space.
IEEE Trans Cybern. 2016 Jul;46(7):1691-703. doi: 10.1109/TCYB.2015.2454334. Epub 2015 Aug 26.
3. Grasp quality measures: review and performance.
Auton Robots. 2015;38(1):65-88. doi: 10.1007/s10514-014-9402-3. Epub 2014 Jul 31.
4. GadgetArm: Automatic Grasp Generation and Manipulation of 4-DOF Robot Arm for Arbitrary Objects Through Reinforcement Learning.
Sensors (Basel). 2020 Oct 30;20(21):6183. doi: 10.3390/s20216183.
5. Feature Sensing and Robotic Grasping of Objects with Uncertain Information: A Review.
Sensors (Basel). 2020 Jul 2;20(13):3707. doi: 10.3390/s20133707.
6. Laser Ranging-Assisted Binocular Visual Sensor Tracking System.
Sensors (Basel). 2020 Jan 27;20(3):688. doi: 10.3390/s20030688.
7. Roles of glabrous skin receptors and sensorimotor memory in automatic control of precision grip when lifting rougher or more slippery objects.
Exp Brain Res. 1984;56(3):550-64. doi: 10.1007/BF00237997.