


An intelligent emulsion explosive grasping and filling system based on YOLO-SimAM-GRCNN.

Authors

Yi Jiangang, Liu Peng, Gao Jun, Yuan Rui, Wu Jiajun

Affiliations

State Key Laboratory of Precision Blasting, Jianghan University, Wuhan, 430056, China.

Hubei Key Laboratory of Industrial Fume and Dust Pollution Control, Jianghan University, Wuhan, 430056, China.

Publication

Sci Rep. 2024 Nov 18;14(1):28425. doi: 10.1038/s41598-024-77034-0.

DOI:10.1038/s41598-024-77034-0
PMID:39557931
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11574010/
Abstract

For the blasting scenario, our research develops an emulsion explosive grasping and filling system suitable for tunnel robots. First, we designed a system, YOLO-SimAM-GRCNN, which consists of an inference module and a control module. The inference module comprises a blast hole position detection network based on YOLOv8 and an explosive grasping network based on SimAM-GRCNN. The control module plans and executes the robot's motion control based on the output of the inference module to achieve symmetric grasping and filling operations. Meanwhile, the SimAM-GRCNN grasping network model is evaluated comparatively on the Cornell and Jacquard datasets, achieving grasping detection accuracies of 98.8% and 95.2%, respectively. In addition, on a self-built emulsion explosive dataset, the grasping detection accuracy reaches 96.4%. The SimAM-GRCNN grasping network model outperforms the original GRCNN by an average of 1.7% in accuracy, achieving a balance between blast hole detection, grasping accuracy, and filling speed. Finally, experiments are conducted on a Universal Robots 3 manipulator arm, using distributed deployment and a manipulator-arm motion-control mode to achieve an end-to-end grasping and filling process. On a Jetson Xavier NX development board, the average time consumption is 119.67 s, with average success rates of 87.1% for grasping and 79.2% for filling emulsion explosives.
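The two distinctive components named in the abstract can be illustrated with a minimal numpy sketch: the parameter-free SimAM attention (assuming the standard formulation, where each activation is gated by a sigmoid of its deviation-based energy) and the decoding of a single best grasp from GR-ConvNet-style quality/angle/width output maps. The function names and the lambda default are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simam(x, lam=1e-4):
    """Parameter-free SimAM attention over a (C, H, W) feature map.

    Each activation is reweighted by a sigmoid of its inverse energy,
    which grows with the activation's squared deviation from its
    channel mean (lam is a small regularizer; 1e-4 is an assumed default).
    """
    n = x.shape[1] * x.shape[2] - 1
    mu = x.mean(axis=(1, 2), keepdims=True)
    d = (x - mu) ** 2
    var = d.sum(axis=(1, 2), keepdims=True) / n
    e_inv = d / (4.0 * (var + lam)) + 0.5        # inverse energy per neuron
    return x * (1.0 / (1.0 + np.exp(-e_inv)))    # sigmoid gating in (0, 1)

def decode_grasp(quality, angle, width):
    """Pick one grasp from GR-ConvNet-style (H, W) output maps.

    The grasp pose is read off at the pixel with the highest
    predicted grasp quality.
    """
    iy, ix = np.unravel_index(np.argmax(quality), quality.shape)
    return (int(iy), int(ix)), float(angle[iy, ix]), float(width[iy, ix])

# Usage: attention-refine a feature map, then decode a grasp candidate.
feats = np.random.default_rng(0).standard_normal((8, 16, 16))
refined = simam(feats)
q = np.zeros((5, 5)); q[2, 3] = 1.0
pos, ang, wid = decode_grasp(q, np.full((5, 5), 0.7), np.full((5, 5), 0.1))
```

Because the sigmoid gate lies in (0, 1), SimAM only attenuates activations (most strongly those near the channel mean) and adds no learnable parameters, which is why it can be dropped into GRCNN without changing its training setup.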


Figures (PMC full text, Figs. 1-14):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/e6bacd33c440/41598_2024_77034_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/f7d7ce1733d7/41598_2024_77034_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/ac0bf13732d0/41598_2024_77034_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/dcba8d266337/41598_2024_77034_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/b262a7c78017/41598_2024_77034_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/fa29b4ab11b0/41598_2024_77034_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/ed988c6edf3c/41598_2024_77034_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/ca5a0c872c6d/41598_2024_77034_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/ae3fc6da0d56/41598_2024_77034_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/15e939049ae0/41598_2024_77034_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/ef81769dd3a3/41598_2024_77034_Fig11_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/8dc060999781/41598_2024_77034_Fig12_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/4e9a7f41c656/41598_2024_77034_Fig13_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/3d79/11574010/9a151e6c1aee/41598_2024_77034_Fig14_HTML.jpg

Similar articles

1. An intelligent emulsion explosive grasping and filling system based on YOLO-SimAM-GRCNN. Sci Rep. 2024 Nov 18;14(1):28425. doi: 10.1038/s41598-024-77034-0.
2. Object Recognition and Grasping for Collaborative Robots Based on Vision. Sensors (Basel). 2023 Dec 28;24(1):195. doi: 10.3390/s24010195.
3. Secure Grasping Detection of Objects in Stacked Scenes Based on Single-Frame RGB Images. Sensors (Basel). 2023 Sep 24;23(19):8054. doi: 10.3390/s23198054.
4. Grasping detection of dual manipulators based on Markov decision process with neural network. Neural Netw. 2024 Jan;169:778-792. doi: 10.1016/j.neunet.2023.09.016. Epub 2023 Sep 14.
5. A two-stage grasp detection method for sequential robotic grasping in stacking scenarios. Math Biosci Eng. 2024 Feb 5;21(2):3448-3472. doi: 10.3934/mbe.2024152.
6. A Real-Time Grasping Detection Network Architecture for Various Grasping Scenarios. IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):8215-8226. doi: 10.1109/TNNLS.2024.3419180. Epub 2025 May 2.
7. A Method for Real-Time Recognition of Safflower Filaments in Unstructured Environments Using the YOLO-SaFi Model. Sensors (Basel). 2024 Jul 8;24(13):4410. doi: 10.3390/s24134410.
8. Lightweight strip steel defect detection algorithm based on improved YOLOv7. Sci Rep. 2024 Jun 10;14(1):13267. doi: 10.1038/s41598-024-64080-x.
9. SLGA-YOLO: A Lightweight Castings Surface Defect Detection Method Based on Fusion-Enhanced Attention Mechanism and Self-Architecture. Sensors (Basel). 2024 Jun 24;24(13):4088. doi: 10.3390/s24134088.
10. Enhanced tomato detection in greenhouse environments: a lightweight model based on S-YOLO with high accuracy. Front Plant Sci. 2024 Aug 22;15:1451018. doi: 10.3389/fpls.2024.1451018. eCollection 2024.

References cited in this article

1. Intelligent robotics harvesting system process for fruits grasping prediction. Sci Rep. 2024 Feb 3;14(1):2820. doi: 10.1038/s41598-024-52743-8.
2. Research on Robot Grasping Based on Deep Learning for Real-Life Scenarios. Micromachines (Basel). 2023 Jul 8;14(7):1392. doi: 10.3390/mi14071392.
3. GR-ConvNet v2: A Real-Time Multi-Grasp Detection Network for Robotic Grasping. Sensors (Basel). 2022 Aug 18;22(16):6208. doi: 10.3390/s22166208.
4. Grasping learning, optimization, and knowledge transfer in the robotics field. Sci Rep. 2022 Mar 16;12(1):4481. doi: 10.1038/s41598-022-08276-z.
5. A pushing-grasping collaborative method based on deep Q-network algorithm in dual viewpoints. Sci Rep. 2022 Mar 10;12(1):3927. doi: 10.1038/s41598-022-07900-2.
6. Object Detection Method for Grasping Robot Based on Improved YOLOv5. Micromachines (Basel). 2021 Oct 20;12(11):1273. doi: 10.3390/mi12111273.
7. Mirror neurons are modulated by grip force and reward expectation in the sensorimotor cortices (S1, M1, PMd, PMv). Sci Rep. 2021 Aug 5;11(1):15959. doi: 10.1038/s41598-021-95536-z.
8. DGCM-Net: Dense Geometrical Correspondence Matching Network for Incremental Experience-Based Robotic Grasping. Front Robot AI. 2020 Sep 17;7:120. doi: 10.3389/frobt.2020.00120. eCollection 2020.