

Research on Multi-Hole Localization Tracking Based on a Combination of Machine Vision and Deep Learning

Authors

Hou Rong, Yin Jianping, Liu Yanchen, Lu Huijuan

Affiliations

School of Mechanical and Electrical Engineering, North University of China, Taiyuan 030051, China.

School of Life and Environmental Sciences, Guilin University of Electronic Technology, Guilin 541004, China.

Publication

Sensors (Basel). 2024 Feb 2;24(3):984. doi: 10.3390/s24030984.

DOI: 10.3390/s24030984
PMID: 38339701
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10857067/
Abstract

In industrial production, manual assembly of workpieces is inefficient and labor-intensive, and some assembly tasks pose a degree of danger to workers. At the same time, traditional machine-learning algorithms struggle to adapt to the complexity of current industrial field environments: changes in the environment can greatly reduce the accuracy of a robot's work. This paper therefore proposes a method that combines machine vision with the YOLOv5 deep-learning model to obtain localization information for the holes of a disk workpiece; after coordinate mapping, a robotic arm is controlled via ROS communication, which improves resistance to environmental interference and work efficiency while reducing danger to workers. The system uses a camera to collect real-time images of targets in complex environments, then trains on and processes these images for recognition so that coordinate localization information can be obtained. This information is converted into coordinates in the robot coordinate system through hand-eye calibration, and the robot is then controlled, via communication between the upper and lower computers, to complete multi-hole localization and tracking. The results show high accuracy in training and testing on the target object, and the control accuracy of the robotic arm is also relatively high. The method is strongly resistant to interference from complex industrial environments and demonstrates feasibility and effectiveness. It lays a foundation for the automated installation of docking disk workpieces in industrial production and offers a favorable option for production and installation processes that require screw positioning.
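The pipeline the abstract describes, detecting a hole in the image and then mapping its location into the robot base frame through the hand-eye calibration transform, can be sketched as follows. This is an illustrative sketch, not the authors' code: the camera intrinsics, depth value, and hand-eye transform below are assumed values for demonstration.

```python
import numpy as np

def pixel_to_camera(u, v, depth, fx, fy, cx, cy):
    """Back-project a pixel (u, v) at a known depth into a camera-frame
    3D point, returned in homogeneous coordinates."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth, 1.0])

def camera_to_robot(p_cam, T_base_cam):
    """Apply the 4x4 hand-eye calibration transform (base <- camera)
    and return the Cartesian point in the robot base frame."""
    return (T_base_cam @ p_cam)[:3]

# Assumed intrinsics and an identity-rotation hand-eye transform
fx = fy = 600.0          # focal lengths in pixels
cx, cy = 320.0, 240.0    # principal point
T_base_cam = np.eye(4)
T_base_cam[:3, 3] = [0.5, 0.0, 0.8]  # camera offset from robot base (m)

# Hole center detected at pixel (380, 300) with 0.6 m depth
p_cam = pixel_to_camera(380.0, 300.0, 0.6, fx, fy, cx, cy)
p_base = camera_to_robot(p_cam, T_base_cam)
# p_base is the target the arm controller would be commanded to reach
```

In practice the detector (e.g., YOLOv5) supplies the pixel center of each hole's bounding box, and the calibrated transform is obtained offline from a hand-eye calibration routine rather than written down directly as above.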


Figures g001 through g026 are available with the full text at PMC (PMC10857067).

Similar Articles

1. Research on Multi-Hole Localization Tracking Based on a Combination of Machine Vision and Deep Learning.
   Sensors (Basel). 2024 Feb 2;24(3):984. doi: 10.3390/s24030984.
2. Machine Learning Techniques for Increasing Efficiency of the Robot's Sensor and Control Information Processing.
   Sensors (Basel). 2022 Jan 29;22(3):1062. doi: 10.3390/s22031062.
3. A novel multidimensional uncalibration method applied to six-axis manipulators.
   Front Neurosci. 2023 Jul 14;17:1221740. doi: 10.3389/fnins.2023.1221740. eCollection 2023.
4. A novel hand-eye calibration method of picking robot based on TOF camera.
   Front Plant Sci. 2023 Jan 17;13:1099033. doi: 10.3389/fpls.2022.1099033. eCollection 2022.
5. A Tandem Robotic Arm Inverse Kinematic Solution Based on an Improved Particle Swarm Algorithm.
   Front Bioeng Biotechnol. 2022 May 19;10:832829. doi: 10.3389/fbioe.2022.832829. eCollection 2022.
6. Submillimeter-Accurate Markerless Hand-Eye Calibration Based on a Robot's Flange Features.
   Sensors (Basel). 2024 Feb 7;24(4):1071. doi: 10.3390/s24041071.
7. Recognition and Counting of Apples in a Dynamic State Using a 3D Camera and Deep Learning Algorithms for Robotic Harvesting Systems.
   Sensors (Basel). 2023 Apr 7;23(8):3810. doi: 10.3390/s23083810.
8. Deep Q-Learning in Robotics: Improvement of Accuracy and Repeatability.
   Sensors (Basel). 2022 May 21;22(10):3911. doi: 10.3390/s22103911.
9. No-code robotic programming for agile production: A new markerless approach for multimodal natural interaction in a human-robot collaboration context.
   Front Robot AI. 2022 Oct 4;9:1001955. doi: 10.3389/frobt.2022.1001955. eCollection 2022.
10. Applying High-Speed Vision Sensing to an Industrial Robot for High-Performance Position Regulation under Uncertainties.
    Sensors (Basel). 2016 Jul 29;16(8):1195. doi: 10.3390/s16081195.

Cited By

1. Secure Fusion with Labeled Multi-Bernoulli Filter for Multisensor Multitarget Tracking Against False Data Injection Attacks.
   Sensors (Basel). 2025 Jun 3;25(11):3526. doi: 10.3390/s25113526.
2. Research on the Digital Twin System of Welding Robots Driven by Data.
   Sensors (Basel). 2025 Jun 22;25(13):3889. doi: 10.3390/s25133889.

References

1. Investigation of active tracking for robotic arm assisted magnetic resonance guided focused ultrasound ablation.
   Int J Med Robot. 2017 Sep;13(3). doi: 10.1002/rcs.1768. Epub 2016 Aug 24.