
Industrial Robot Control by Means of Gestures and Voice Commands in Off-Line and On-Line Mode.

Affiliation

Faculty of Mechatronics, Armament and Aerospace, Military University of Technology, Kaliskiego 2 Street, 00-908 Warsaw, Poland.

Publication Information

Sensors (Basel). 2020 Nov 7;20(21):6358. doi: 10.3390/s20216358.

DOI: 10.3390/s20216358
PMID: 33171844
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7664672/
Abstract

The paper presents the possibility of using the Kinect v2 module to control an industrial robot by means of gestures and voice commands. It describes the elements of creating software for off-line and on-line robot control. The application for the Kinect module was developed in the C# language in the Visual Studio environment, while the industrial robot control program was developed in the RAPID language in the RobotStudio environment. The development of a two-threaded application in the RAPID language allowed separating two independent tasks for the IRB120 robot. The main task of the robot is performed in Thread No. 1 (responsible for movement). Simultaneously, Thread No. 2 ensures continuous communication with the Kinect system and provides information about the gesture and voice commands in real time without any interference in Thread No. 1. The applied solution allows the robot to work in industrial conditions without the negative impact of the communication task on the time of the robot's work cycles. Thanks to the development of a digital twin of the real robot station, tests of proper application functioning in off-line mode (without using a real robot) were conducted. The obtained results were verified on-line (on the real test station). Tests of the correctness of gesture recognition were carried out, and the robot recognized all programmed gestures. Another test carried out was the recognition and execution of voice commands. A difference in the time of task completion between the actual and virtual station was noticed; the average difference was 0.67 s. The last test carried out was to examine the impact of interference on the recognition of voice commands. With a 10 dB difference between the command and noise, the recognition of voice commands was equal to 91.43%. The developed computer programs have a modular structure, which enables easy adaptation to process requirements.
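The abstract's two-thread architecture (Thread No. 1 executes robot motion while Thread No. 2 receives gesture/voice commands without interfering with it) can be sketched with a thread-safe queue. This is a minimal illustrative Python sketch of the pattern only, not the authors' C#/RAPID code; all names (`command_listener`, `motion_loop`, the command strings) are hypothetical.

```python
import queue
import threading

def command_listener(commands, out_queue, done):
    # Stand-in for Thread No. 2: forward each recognized gesture/voice
    # command (here, a fixed list replacing the Kinect stream) to the queue.
    for cmd in commands:
        out_queue.put(cmd)
    done.set()  # signal that no more commands will arrive

def motion_loop(out_queue, done, executed):
    # Stand-in for Thread No. 1: run the motion task, polling for commands
    # with a short timeout so motion is never blocked by communication.
    while not (done.is_set() and out_queue.empty()):
        try:
            cmd = out_queue.get(timeout=0.05)
        except queue.Empty:
            continue  # no command pending; motion work would continue here
        executed.append(cmd)  # stand-in for executing a robot move

def run_demo(commands):
    q = queue.Queue()          # thread-safe FIFO decouples the two threads
    done = threading.Event()
    executed = []
    t2 = threading.Thread(target=command_listener, args=(commands, q, done))
    t1 = threading.Thread(target=motion_loop, args=(q, done, executed))
    t1.start()
    t2.start()
    t2.join()
    t1.join()
    return executed

if __name__ == "__main__":
    print(run_demo(["wave_right", "stop", "go_home"]))
```

Because the queue is the only shared state, the listener can never stall the motion loop, which mirrors the paper's claim that the communication task has no negative impact on the robot's work-cycle times.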


Figures (g001–g009):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/1f82dd3c2d9f/sensors-20-06358-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/c6954f937367/sensors-20-06358-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/b909a6b4abf5/sensors-20-06358-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/c95026e5cbd9/sensors-20-06358-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/efce5298e6e5/sensors-20-06358-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/87c33354d76e/sensors-20-06358-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/a2c3a618f9f5/sensors-20-06358-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/6c3c13a59665/sensors-20-06358-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ec3f/7664672/ab5dc30f6417/sensors-20-06358-g009.jpg

Similar Articles

1. Industrial Robot Control by Means of Gestures and Voice Commands in Off-Line and On-Line Mode.
   Sensors (Basel). 2020 Nov 7;20(21):6358. doi: 10.3390/s20216358.
2. Controlling an Industrial Robot Using a Graphic Tablet in Offline and Online Mode.
   Sensors (Basel). 2021 Apr 1;21(7):2439. doi: 10.3390/s21072439.
3. Speech and motion control for interventional radiology: requirements and feasibility.
   Int J Comput Assist Radiol Surg. 2013 Nov;8(6):997-1002. doi: 10.1007/s11548-013-0841-7. Epub 2013 Apr 13.
4. Integration of Industrially-Oriented Human-Robot Speech Communication and Vision-Based Object Recognition.
   Sensors (Basel). 2020 Dec 18;20(24):7287. doi: 10.3390/s20247287.
5. Robot Learning of Assistive Manipulation Tasks by Demonstration via Head Gesture-based Interface.
   IEEE Int Conf Rehabil Robot. 2019 Jun;2019:1139-1146. doi: 10.1109/ICORR.2019.8779379.
6. Improving gesture-based interaction between an assistive bathing robot and older adults via user training on the gestural commands.
   Arch Gerontol Geriatr. 2020 Mar-Apr;87:103996. doi: 10.1016/j.archger.2019.103996. Epub 2019 Dec 13.
7. Robust Understanding of Robot-Directed Speech Commands Using Sequence to Sequence With Noise Injection.
   Front Robot AI. 2020 Jan 14;6:144. doi: 10.3389/frobt.2019.00144. eCollection 2019.
8. Navigation of a virtual exercise environment with Microsoft Kinect by people post-stroke or with cerebral palsy.
   Assist Technol. 2016 Winter;28(4):225-232. doi: 10.1080/10400435.2016.1167789. Epub 2016 Aug 31.
9. Single-Equipment with Multiple-Application for an Automated Robot-Car Control System.
   Sensors (Basel). 2019 Feb 6;19(3):662. doi: 10.3390/s19030662.
10. Hand Gesture Interface for Robot Path Definition in Collaborative Applications: Implementation and Comparative Study.
    Sensors (Basel). 2023 Apr 23;23(9):4219. doi: 10.3390/s23094219.

Cited By

1. Application for Recognizing Sign Language Gestures Based on an Artificial Neural Network.
   Sensors (Basel). 2022 Dec 15;22(24):9864. doi: 10.3390/s22249864.
2. A Mixed-Reality Tele-Operation Method for High-Level Control of a Legged-Manipulator Robot.
   Sensors (Basel). 2022 Oct 24;22(21):8146. doi: 10.3390/s22218146.
3. Controlling an Industrial Robot Using a Graphic Tablet in Offline and Online Mode.
   Sensors (Basel). 2021 Apr 1;21(7):2439. doi: 10.3390/s21072439.

References

1. Selection and Optimization of the Parameters of the Robotized Packaging Process of One Type of Product.
   Sensors (Basel). 2020 Sep 19;20(18):5378. doi: 10.3390/s20185378.
2. Analysis of the Kinetics of Swimming Pool Water Reaction in Analytical Device Reproducing Its Circulation on a Small Scale.
   Sensors (Basel). 2020 Aug 26;20(17):4820. doi: 10.3390/s20174820.
3. LiDAR-Based System and Optical VHR Data for Building Detection and Mapping.
   Sensors (Basel). 2020 Feb 27;20(5):1285. doi: 10.3390/s20051285.
4. An Improved Point Cloud Descriptor for Vision Based Robotic Grasping System.
   Sensors (Basel). 2019 May 14;19(10):2225. doi: 10.3390/s19102225.
5. Embedded Processing and Compression of 3D Sensor Data for Large Scale Industrial Environments.
   Sensors (Basel). 2019 Feb 2;19(3):636. doi: 10.3390/s19030636.
6. Three-Dimensional Object Recognition and Registration for Robotic Grasping Systems Using a Modified Viewpoint Feature Histogram.
   Sensors (Basel). 2016 Nov 23;16(11):1969. doi: 10.3390/s16111969.