Suppr 超能文献


Multi-Humanoid Robot Arm Motion Imitation and Collaboration Based on Improved Retargeting.

Authors

Jiang Xisheng, Wu Baolei, Li Simin, Zhu Yongtong, Liang Guoxiang, Yuan Ye, Li Qingdu, Zhang Jianwei

Affiliations

School of Optoelectronic Information and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China.

Institute of Machine Intelligence, University of Shanghai for Science and Technology, Shanghai 200093, China.

Publication

Biomimetics (Basel). 2025 Mar 19;10(3):190. doi: 10.3390/biomimetics10030190.

DOI: 10.3390/biomimetics10030190
PMID: 40136844
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11939925/
Abstract

Human-robot interaction (HRI) is a key technology in the field of humanoid robotics, and motion imitation is one of the most direct ways to achieve efficient HRI. However, due to significant differences in structure, range of motion, and joint torques between the human body and robots, motion imitation remains a challenging task. Traditional retargeting algorithms, while effective in mapping human motion to robots, typically either ensure similarity in arm configuration (joint space-based) or focus solely on tracking the end-effector position (Cartesian space-based). This creates a conflict between the liveliness and accuracy of robot motion. To address this issue, this paper proposes an improved retargeting algorithm that ensures both the similarity of the robot's arm configuration to that of the human body and accurate end-effector position tracking. Additionally, a multi-person pose estimation algorithm is introduced, enabling real-time capture of multiple imitators' movements using a single RGB-D camera. The captured motion data are used as input to the improved retargeting algorithm, enabling multi-robot collaboration tasks. Experimental results demonstrate that the proposed algorithm effectively ensures consistency in arm configuration and precise end-effector position tracking. Furthermore, the collaborative experiments validate the generalizability of the improved retargeting algorithm and the superior real-time performance of the multi-person pose estimation algorithm.
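The abstract frames the improvement as resolving the conflict between joint-space retargeting (arm-configuration similarity, i.e., liveliness) and Cartesian-space retargeting (end-effector accuracy). One common way to express such a blend is a weighted cost over both terms, minimized per frame. The sketch below is illustrative only and is not the paper's actual formulation: the planar 2-link arm, the link lengths, the weights, and the finite-difference gradient descent are all assumptions.

```python
import math

L1, L2 = 0.3, 0.25  # hypothetical link lengths (m) for a planar 2-link arm

def fk(q):
    """Forward kinematics: joint angles -> end-effector position (x, y)."""
    x = L1 * math.cos(q[0]) + L2 * math.cos(q[0] + q[1])
    y = L1 * math.sin(q[0]) + L2 * math.sin(q[0] + q[1])
    return x, y

def cost(q, q_human, p_target, alpha=1.0, beta=100.0):
    """Blend joint-space similarity (liveliness) with Cartesian tracking (accuracy)."""
    sim = sum((qi - qh) ** 2 for qi, qh in zip(q, q_human))   # stay near human pose
    x, y = fk(q)
    track = (x - p_target[0]) ** 2 + (y - p_target[1]) ** 2   # reach the target point
    return alpha * sim + beta * track

def retarget(q_human, p_target, iters=3000, step=0.005, eps=1e-6):
    """Minimize the blended cost by finite-difference gradient descent."""
    q = list(q_human)  # warm-start from the human configuration
    for _ in range(iters):
        grad = []
        for i in range(len(q)):
            qp, qm = q[:], q[:]
            qp[i] += eps
            qm[i] -= eps
            grad.append((cost(qp, q_human, p_target) - cost(qm, q_human, p_target)) / (2 * eps))
        q = [qi - step * gi for qi, gi in zip(q, grad)]
    return q
```

With a large `beta` the solution prioritizes end-effector tracking while `alpha` keeps the joint angles close to the imitator's, which is the trade-off the abstract describes; the paper's algorithm presumably resolves it more carefully than this single weighted sum.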

Figures (PMC full text):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/1b74d9ad80d9/biomimetics-10-00190-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/227670a8a4bd/biomimetics-10-00190-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/22dc07be0185/biomimetics-10-00190-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/f7253186d052/biomimetics-10-00190-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/de4bdd9480e0/biomimetics-10-00190-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/c08dc1af307a/biomimetics-10-00190-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/72d3a80135f2/biomimetics-10-00190-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/851609b263ac/biomimetics-10-00190-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/246877604dfd/biomimetics-10-00190-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/659716d4c759/biomimetics-10-00190-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/a3357234bc99/biomimetics-10-00190-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/741c454f2802/biomimetics-10-00190-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/e608a0932dd9/biomimetics-10-00190-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/826a18b4bad7/biomimetics-10-00190-g014.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/22d887828698/biomimetics-10-00190-g015.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/ef86313fe68a/biomimetics-10-00190-g0A1.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/02d3e384f314/biomimetics-10-00190-g0A2a.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/f1fd/11939925/7c652dd179ec/biomimetics-10-00190-g0A3.jpg

Similar Articles

1. Motion Similarity Evaluation between Human and a Tri-Co Robot during Real-Time Imitation with a Trajectory Dynamic Time Warping Model.
   Sensors (Basel). 2022 Mar 2;22(5):1968. doi: 10.3390/s22051968.
2. A Tandem Robotic Arm Inverse Kinematic Solution Based on an Improved Particle Swarm Algorithm.
   Front Bioeng Biotechnol. 2022 May 19;10:832829. doi: 10.3389/fbioe.2022.832829. eCollection 2022.
3. A Whole-Body Coordinated Motion Control Method for Highly Redundant Degrees of Freedom Mobile Humanoid Robots.
   Biomimetics (Basel). 2024 Dec 16;9(12):766. doi: 10.3390/biomimetics9120766.
4. Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head- and Eye-Gaze Interface.
   Sensors (Basel). 2021 Mar 5;21(5):1798. doi: 10.3390/s21051798.
5. Human arm joints reconstruction algorithm in rehabilitation therapies assisted by end-effector robotic devices.
   J Neuroeng Rehabil. 2018 Feb 20;15(1):10. doi: 10.1186/s12984-018-0348-0.
6. Path Planning and Motion Control of Robot Dog Through Rough Terrain Based on Vision Navigation.
   Sensors (Basel). 2024 Nov 15;24(22):7306. doi: 10.3390/s24227306.
7. Research on Human-Robot Collaboration Method for Parallel Robots Oriented to Segment Docking.
   Sensors (Basel). 2024 Mar 8;24(6):1747. doi: 10.3390/s24061747.
8. Whole-Body Dynamics for Humanoid Robot Fall Protection Trajectory Generation with Wall Support.
   Biomimetics (Basel). 2024 Apr 19;9(4):245. doi: 10.3390/biomimetics9040245.
9. Human-in-the-Loop Robot Control for Human-Robot Collaboration: Human Intention Estimation and Safe Trajectory Tracking Control for Collaborative Tasks.
   IEEE Control Syst. 2020 Dec;40(6):29-56. Epub 2020 Nov 16.

Cited By

1. Task Scheduling of Multiple Humanoid Robot Manipulators by Using Symbolic Control.
   Biomimetics (Basel). 2025 May 24;10(6):346. doi: 10.3390/biomimetics10060346.

References Cited in This Article

1. Adaptive Gravity Compensation Framework Based on Human Upper Limb Model for Assistive Robotic Arm Extender.
   IEEE Int Conf Rehabil Robot. 2023 Sep;2023:1-6. doi: 10.1109/ICORR58425.2023.10304690.
2. On the role and opportunities in teamwork design for advanced multi-robot search systems.
   Front Robot AI. 2023 Apr 13;10:1089062. doi: 10.3389/frobt.2023.1089062. eCollection 2023.
3. Imitating by Generating: Deep Generative Models for Imitation of Interactive Tasks.
   Front Robot AI. 2020 Apr 16;7:47. doi: 10.3389/frobt.2020.00047. eCollection 2020.
4. OpenPose: Realtime Multi-Person 2D Pose Estimation Using Part Affinity Fields.
   IEEE Trans Pattern Anal Mach Intell. 2021 Jan;43(1):172-186. doi: 10.1109/TPAMI.2019.2929257. Epub 2020 Dec 4.