Suppr 超能文献


Eye-in-Hand Robotic Arm Gripping System Based on Machine Learning and State Delay Optimization.

Affiliation

Graduate Institute of Automation Technology, National Taipei University of Technology, Taipei 10608, Taiwan.

Publication

Sensors (Basel). 2023 Jan 17;23(3):1076. doi: 10.3390/s23031076.

DOI: 10.3390/s23031076
PMID: 36772116
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9919884/
Abstract

This research focused on using RGB-D images and modifying an existing machine learning network architecture to generate predictions of the location of successfully grasped objects and to optimize the control system for state delays. A five-finger gripper designed to mimic the human palm was tested to demonstrate that it can perform more delicate missions than many two- or three-finger grippers. Experiments were conducted using the 6-DOF robot arm with the five-finger and two-finger grippers to perform at least 100 actual machine grasps, and compared to the results of other studies. Additionally, we investigated state time delays and proposed a control method for a robot manipulator. Many studies on time-delay systems have been conducted, but most focus on input and output delays. One reason for this emphasis is that input and output delays are the most commonly occurring delays in physical or electronic systems. An additional reason is that state delays increase the complexity of the overall control system. Finally, it was demonstrated that our network can perform as well as a deep network architecture with little training data and omitting steps, such as posture evaluation, and when combined with the hardware advantages of the five-finger gripper, it can produce an automated system with a gripping success rate of over 90%. This paper is an extended study of the conference paper.


[Figures sensors-23-01076-g001 through g027 are available with the PMC full text (PMC9919884); duplicated image URLs omitted.]

Similar Articles

1
Eye-in-Hand Robotic Arm Gripping System Based on Machine Learning and State Delay Optimization.
Sensors (Basel). 2023 Jan 17;23(3):1076. doi: 10.3390/s23031076.
2
GadgetArm-Automatic Grasp Generation and Manipulation of 4-DOF Robot Arm for Arbitrary Objects Through Reinforcement Learning.
Sensors (Basel). 2020 Oct 30;20(21):6183. doi: 10.3390/s20216183.
3
A compact underactuated gripper with two fingers and a retractable suction cup.
Front Robot AI. 2023 Apr 17;10:1066516. doi: 10.3389/frobt.2023.1066516. eCollection 2023.
4
Optimal Design of a Soft Robotic Gripper for Grasping Unknown Objects.
Soft Robot. 2018 Aug;5(4):452-465. doi: 10.1089/soro.2017.0121. Epub 2018 May 9.
5
Combining Sensors Information to Enhance Pneumatic Grippers Performance.
Sensors (Basel). 2021 Jul 24;21(15):5020. doi: 10.3390/s21155020.
6
Sensor-Less and Control-Less Underactuated Grippers With Pull-In Mechanisms for Grasping Various Objects.
Front Robot AI. 2021 Feb 22;8:631242. doi: 10.3389/frobt.2021.631242. eCollection 2021.
7
TriTrap: A Robotic Gripper Inspired by Insect Tarsal Chains.
Biomimetics (Basel). 2024 Feb 26;9(3):142. doi: 10.3390/biomimetics9030142.
8
Dataset with Tactile and Kinesthetic Information from a Human Forearm and Its Application to Deep Learning.
Sensors (Basel). 2022 Nov 12;22(22):8752. doi: 10.3390/s22228752.
9
Machine Learning Techniques for Increasing Efficiency of the Robot's Sensor and Control Information Processing.
Sensors (Basel). 2022 Jan 29;22(3):1062. doi: 10.3390/s22031062.
10
Rod-based Fabrication of Customizable Soft Robotic Pneumatic Gripper Devices for Delicate Tissue Manipulation.
J Vis Exp. 2016 Aug 2;(114):54175. doi: 10.3791/54175.

Cited By

1
An End-to-End Computationally Lightweight Vision-Based Grasping System for Grocery Items.
Sensors (Basel). 2025 Aug 26;25(17):5309. doi: 10.3390/s25175309.
2
Submillimeter-Accurate Markerless Hand-Eye Calibration Based on a Robot's Flange Features.
Sensors (Basel). 2024 Feb 7;24(4):1071. doi: 10.3390/s24041071.

References

1
Development of a Robot Arm Link System Embedded with a Three-Axis Sensor with a Simple Structure Capable of Excellent External Collision Detection.
Sensors (Basel). 2022 Feb 5;22(3):1222. doi: 10.3390/s22031222.
2
A Case Study of Upper Limb Robotic-Assisted Therapy Using the Track-Hold Device.
Sensors (Basel). 2022 Jan 28;22(3):1009. doi: 10.3390/s22031009.
3
Real-Time Fruit Recognition and Grasping Estimation for Robotic Apple Harvesting.
Sensors (Basel). 2020 Oct 4;20(19):5670. doi: 10.3390/s20195670.
4
Robot Intelligent Grasp of Unknown Objects Based on Multi-Sensor Information.
Sensors (Basel). 2019 Apr 2;19(7):1595. doi: 10.3390/s19071595.
5
State observer based robust adaptive fuzzy controller for nonlinear uncertain and perturbed systems.
IEEE Trans Syst Man Cybern B Cybern. 2004 Apr;34(2):942-50. doi: 10.1109/tsmcb.2003.818562.