

Segment, Compare, and Learn: Creating Movement Libraries of Complex Task for Learning from Demonstration.

Author information

Prados Adrian, Espinoza Gonzalo, Moreno Luis, Barber Ramon

Affiliations

RoboticsLab, Universidad Carlos III de Madrid, 28911 Madrid, Spain.

Publication information

Biomimetics (Basel). 2025 Jan 17;10(1):64. doi: 10.3390/biomimetics10010064.

DOI: 10.3390/biomimetics10010064
PMID: 39851780
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11759161/
Abstract

Motion primitives are a highly useful and widely employed tool in the field of Learning from Demonstration (LfD). However, obtaining a large number of motion primitives can be a tedious process, as they typically need to be generated individually for each task to be learned. To address this challenge, this work presents an algorithm for acquiring robotic skills through automatic and unsupervised segmentation. The algorithm divides tasks into simpler subtasks and generates motion primitive libraries that group common subtasks for use in subsequent learning processes. Our algorithm is based on an initial segmentation step using a heuristic method, followed by probabilistic clustering with Gaussian Mixture Models. Once the segments are obtained, they are grouped using Gaussian Optimal Transport on the Gaussian Processes (GPs) of each segment group, comparing their similarities through the energy cost of transforming one GP into another. This process requires no prior knowledge, it is entirely autonomous, and supports multimodal information. The algorithm enables generating trajectories suitable for robotic tasks, establishing simple primitives that encapsulate the structure of the movements to be performed. Its effectiveness has been validated in manipulation tasks with a real robot, as well as through comparisons with state-of-the-art algorithms.
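The abstract's pipeline — probabilistic clustering of trajectory segments with a Gaussian Mixture Model, then similarity scoring via Gaussian Optimal Transport — can be illustrated in miniature. For two plain Gaussians, the optimal-transport energy has a closed form (the 2-Wasserstein distance). The sketch below is illustrative only, not the paper's implementation: the paper compares Gaussian Processes fitted to segment groups, and the function and variable names here are assumptions.

```python
import numpy as np
from scipy.linalg import sqrtm
from sklearn.mixture import GaussianMixture

def gaussian_w2(m1, C1, m2, C2):
    """Closed-form 2-Wasserstein (optimal transport) distance between
    N(m1, C1) and N(m2, C2): the energy cost of morphing one Gaussian
    into the other. Similar movement segments give a small cost."""
    s2 = sqrtm(C2)
    cross = np.real(sqrtm(s2 @ C1 @ s2))  # sqrtm can leave tiny imaginary parts
    bures = np.trace(C1 + C2 - 2.0 * cross)
    return float(np.sqrt(max(np.sum((m1 - m2) ** 2) + bures, 0.0)))

# Toy 2-D "trajectory" with two spatially distinct phases, clustered
# into candidate sub-tasks by a Gaussian Mixture Model.
rng = np.random.default_rng(0)
phase_a = rng.normal([0.0, 0.0], 0.1, size=(100, 2))
phase_b = rng.normal([1.0, 1.0], 0.1, size=(100, 2))
points = np.vstack([phase_a, phase_b])
gmm = GaussianMixture(n_components=2, random_state=0).fit(points)

# Compare the two fitted segment models: a large transport cost suggests
# the segments belong to different primitives in the movement library.
cost = gaussian_w2(gmm.means_[0], gmm.covariances_[0],
                   gmm.means_[1], gmm.covariances_[1])
```

In the paper, the comparison is made between the Gaussian Processes of each segment group; plain Gaussians stand in here to keep the sketch self-contained.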


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/94cd069c295a/biomimetics-10-00064-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/2433af8544c7/biomimetics-10-00064-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/f6564ca9a9f3/biomimetics-10-00064-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/ccd01cea3ec7/biomimetics-10-00064-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/3447f31012bd/biomimetics-10-00064-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/3dd42190cbaa/biomimetics-10-00064-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/126a36d983dc/biomimetics-10-00064-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/b3482fe2e0f1/biomimetics-10-00064-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/66cf730e6df1/biomimetics-10-00064-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/3579ee9a7152/biomimetics-10-00064-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/da495ffa7864/biomimetics-10-00064-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/07364b506d05/biomimetics-10-00064-g012.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/55ba47ccc167/biomimetics-10-00064-g013.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/6779/11759161/0829535d74d6/biomimetics-10-00064-g014.jpg

Similar articles

1. Segment, Compare, and Learn: Creating Movement Libraries of Complex Task for Learning from Demonstration.
   Biomimetics (Basel). 2025 Jan 17;10(1):64. doi: 10.3390/biomimetics10010064.
2. A Task-Learning Strategy for Robotic Assembly Tasks from Human Demonstrations.
   Sensors (Basel). 2020 Sep 25;20(19):5505. doi: 10.3390/s20195505.
3. Robot complex motion learning based on unsupervised trajectory segmentation and movement primitives.
   ISA Trans. 2020 Feb;97:325-335. doi: 10.1016/j.isatra.2019.08.007. Epub 2019 Aug 5.
4. Peg-in-hole assembly skill imitation learning method based on ProMPs under task geometric representation.
   Front Neurorobot. 2023 Nov 9;17:1320251. doi: 10.3389/fnbot.2023.1320251. eCollection 2023.
5. Guided Stochastic Optimization for Motion Planning.
   Front Robot AI. 2019 Nov 12;6:105. doi: 10.3389/frobt.2019.00105. eCollection 2019.
6. ASAP-CORPS: A Semi-Autonomous Platform for COntact-Rich Precision Surgery.
   Mil Med. 2023 Nov 8;188(Suppl 6):412-419. doi: 10.1093/milmed/usad175.
7. A novel trajectory learning method for robotic arms based on Gaussian Mixture Model and k-value selection algorithm.
   PLoS One. 2025 Feb 14;20(2):e0318403. doi: 10.1371/journal.pone.0318403. eCollection 2025.
8. Robot Task-Constrained Optimization and Adaptation with Probabilistic Movement Primitives.
   Biomimetics (Basel). 2024 Dec 3;9(12):738. doi: 10.3390/biomimetics9120738.
9. Learned parametrized dynamic movement primitives with shared synergies for controlling robotic and musculoskeletal systems.
   Front Comput Neurosci. 2013 Oct 17;7:138. doi: 10.3389/fncom.2013.00138. eCollection 2013.
10. Statistical modeling on motion trajectories for robotic laparoscopic surgery.
    Annu Int Conf IEEE Eng Med Biol Soc. 2017 Jul;2017:4347-4350. doi: 10.1109/EMBC.2017.8037818.

References cited in this article

1. ADAM: a robotic companion for enhanced quality of life in aging populations.
   Front Neurorobot. 2024 Feb 9;18:1337608. doi: 10.3389/fnbot.2024.1337608. eCollection 2024.
2. Knowledge-Augmented Deep Learning and its Applications: A Survey.
   IEEE Trans Neural Netw Learn Syst. 2025 Feb;36(2):2133-2153. doi: 10.1109/TNNLS.2023.3338619. Epub 2025 Feb 6.
3. Learning for a Robot: Deep Reinforcement Learning, Imitation Learning, Transfer Learning.
   Sensors (Basel). 2021 Feb 11;21(4):1278. doi: 10.3390/s21041278.
4. Optimal transport for Gaussian mixture models.
   IEEE Access. 2018;7:6269-6278. doi: 10.1109/ACCESS.2018.2889838. Epub 2018 Dec 27.
5. Robot complex motion learning based on unsupervised trajectory segmentation and movement primitives.
   ISA Trans. 2020 Feb;97:325-335. doi: 10.1016/j.isatra.2019.08.007. Epub 2019 Aug 5.
6. Unsupervised Trajectory Segmentation for Surgical Gesture Recognition in Robotic Training.
   IEEE Trans Biomed Eng. 2016 Jun;63(6):1280-91. doi: 10.1109/TBME.2015.2493100. Epub 2015 Oct 26.