
Role of Reference Frames for a Safe Human-Robot Interaction.

Affiliations

Mechanical and Industrial Engineering Department, Università degli Studi di Brescia, Via Branze 38, 25123 Brescia, Italy.

STIIMA-CNR, Institute of Intelligent Industrial Technologies and Systems for Advanced Manufacturing, National Research Council of Italy, 00185 Roma, Italy.

Publication information

Sensors (Basel). 2023 Jun 20;23(12):5762. doi: 10.3390/s23125762.

DOI: 10.3390/s23125762
PMID: 37420924
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10304731/
Abstract

Safety plays a key role in human-robot interactions in collaborative robot (cobot) applications. This paper provides a general procedure to guarantee safe workstations allowing human operations, robot contributions, the dynamical environment, and time-variant objects in a set of collaborative robotic tasks. The proposed methodology focuses on the contribution and the mapping of reference frames. Multiple reference frame representation agents are defined at the same time by considering egocentric, allocentric, and route-centric perspectives. The agents are processed to provide a minimal and effective assessment of the ongoing human-robot interactions. The proposed formulation is based on the generalization and proper synthesis of multiple cooperating reference frame agents at the same time. Accordingly, it is possible to achieve a real-time assessment of the safety-related implications through the implementation and fast calculation of proper safety-related quantitative indices. This allows us to define and promptly regulate the controlling parameters of the involved cobot without velocity limitations that are recognized as the main disadvantage. A set of experiments has been realized and investigated to demonstrate the feasibility and effectiveness of the research by using a seven-DOF anthropomorphic arm in combination with a psychometric test. The acquired results agree with the current literature in terms of the kinematic, position, and velocity aspects; use measurement methods based on tests provided to the operator; and introduce novel features of work cell arranging, including the use of virtual instrumentation. Finally, the associated analytical-topological treatments have enabled the development of a safe and comfortable measure to the human-robot relation with satisfactory experimental results compared to previous research. 
Nevertheless, the robot posture, human perception, and learning technologies would have to apply research from multidisciplinary fields such as psychology, gesture, communication, and social sciences in order to be prepared for positioning in real-world applications that offer new challenges for cobot applications.
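The paper provides no code, but its core idea — expressing the human and the robot in multiple cooperating reference frames (egocentric, allocentric, route-centric) and deriving a quantitative safety index from their relation — can be sketched in a few lines. The following is a minimal illustration under assumed conventions: 2D homogeneous transforms, a toy separation-based index, and hypothetical names and constants (`safety_index`, the 0.3 m minimum distance, the velocity penalty) that are not the authors' actual formulation.

```python
import numpy as np

def make_frame(rotation_deg, origin):
    """Homogeneous 2D transform (world <- frame) from a rotation and an origin."""
    t = np.radians(rotation_deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, origin[0]],
                     [s,  c, origin[1]],
                     [0,  0, 1.0]])

def to_world(T_frame, p):
    """Map a point expressed in a local frame into the shared world frame."""
    return (T_frame @ np.array([p[0], p[1], 1.0]))[:2]

def safety_index(p_robot, p_human, v_rel, d_min=0.3):
    """Toy separation-based index in [0, 1]: 1 = safe, 0 = stop.

    Hypothetical formula: the separation margin shrinks when the robot
    closes in fast, then is normalized by the minimum distance.
    """
    d = np.linalg.norm(p_robot - p_human)
    margin = d - d_min - 0.5 * max(v_rel, 0.0)
    return float(np.clip(margin / d_min, 0.0, 1.0))

# Robot base frame (allocentric anchor) and operator's egocentric frame;
# the operator stands 1.5 m away, facing the robot.
T_robot = make_frame(0.0, (0.0, 0.0))
T_human = make_frame(180.0, (1.5, 0.0))

tcp_world = to_world(T_robot, (0.6, 0.2))   # tool point in robot-base coords
hand_world = to_world(T_human, (0.4, 0.0))  # hand in operator coords

print(safety_index(tcp_world, hand_world, v_rel=0.2))
```

In a control loop, such an index could scale the commanded velocity continuously instead of imposing a blanket speed limit, mirroring the paper's stated goal of regulating the cobot without fixed velocity limitations.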


[Figures 1-13 (sensors-23-05762-g001 through g013) are available via the PMC full-text link above.]

Similar articles

1. Role of Reference Frames for a Safe Human-Robot Interaction. Sensors (Basel). 2023 Jun 20;23(12):5762. doi: 10.3390/s23125762.
2. A Unified Multimodal Interface for the RELAX High-Payload Collaborative Robot. Sensors (Basel). 2023 Sep 7;23(18):7735. doi: 10.3390/s23187735.
3. Egocentric Gesture Recognition Using 3D Convolutional Neural Networks for the Spatiotemporal Adaptation of Collaborative Robots. Front Neurorobot. 2021 Nov 23;15:703545. doi: 10.3389/fnbot.2021.703545. eCollection 2021.
4. Emotion-Driven Analysis and Control of Human-Robot Interactions in Collaborative Applications. Sensors (Basel). 2021 Jul 6;21(14):4626. doi: 10.3390/s21144626.
5. Open core control software for surgical robots. Int J Comput Assist Radiol Surg. 2010 May;5(3):211-20. doi: 10.1007/s11548-009-0388-9. Epub 2009 Jul 28.
6. A Mixed-Perception Approach for Safe Human-Robot Collaboration in Industrial Automation. Sensors (Basel). 2020 Nov 7;20(21):6347. doi: 10.3390/s20216347.
7. Cobot Motion Planning Algorithm for Ensuring Human Safety Based on Behavioral Dynamics. Sensors (Basel). 2022 Jun 9;22(12):4376. doi: 10.3390/s22124376.
8. A Human-Following Motion Planning and Control Scheme for Collaborative Robots Based on Human Motion Prediction. Sensors (Basel). 2021 Dec 9;21(24):8229. doi: 10.3390/s21248229.
9. Assisting Operators in Heavy Industrial Tasks: On the Design of an Optimized Cooperative Impedance Fuzzy-Controller With Embedded Safety Rules. Front Robot AI. 2019 Aug 21;6:75. doi: 10.3389/frobt.2019.00075. eCollection 2019.
10. A Tandem Robotic Arm Inverse Kinematic Solution Based on an Improved Particle Swarm Algorithm. Front Bioeng Biotechnol. 2022 May 19;10:832829. doi: 10.3389/fbioe.2022.832829. eCollection 2022.
