


Movement-Based Prosthesis Control with Angular Trajectory Is Getting Closer to Natural Arm Coordination.

Authors

Segas Effie, Leconte Vincent, Doat Emilie, Cattaert Daniel, de Rugy Aymar

Affiliations

University of Bordeaux, CNRS, INCIA, UMR 5287, Bordeaux, France.

Publication

Biomimetics (Basel). 2024 Sep 4;9(9):532. doi: 10.3390/biomimetics9090532.

DOI: 10.3390/biomimetics9090532
PMID: 39329554
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11430227/
Abstract

Traditional myoelectric controls of trans-humeral prostheses fail to provide intuitive coordination of the necessary degrees of freedom. We previously showed that by using artificial neural network predictions to reconstruct distal joints, based on the shoulder posture and movement goals (i.e., position and orientation of the targeted object), participants were able to position and orient an avatar hand to grasp objects with natural arm performances. However, this control involved rapid and unintended prosthesis movements at each modification of the movement goal, impractical for real-life scenarios. Here, we eliminate this abrupt change using novel methods based on an angular trajectory, determined from the speed of stump movement and the gap between the current and the 'goal' distal configurations. These new controls are tested offline and online (i.e., involving participants-in-the-loop) and compared to performances obtained with a natural control. Despite a slight increase in movement time, the new controls allowed twelve valid participants and six participants with trans-humeral limb loss to reach objects at various positions and orientations without prior training. Furthermore, no usability or workload degradation was perceived by participants with upper limb disabilities. The good performances achieved highlight the potential acceptability and effectiveness of those controls for our target population.
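As a rough illustration of the control principle the abstract describes, the distal joints can be stepped toward the network-predicted goal configuration at a rate scaled by the current stump speed, so that a sudden change of movement goal no longer produces an abrupt prosthesis jump: the arm only progresses while the residual limb is moving. This is a minimal sketch under assumed names and gains, not the authors' actual controller:

```python
import numpy as np

def step_distal_joints(current, goal, stump_speed, gain=1.0, dt=0.01):
    """Advance distal joint angles toward the goal configuration.

    The allowed travel per tick is proportional to the stump's angular
    speed, so with a stationary stump the configuration stays frozen
    even if the goal changes. Illustrative sketch only; the function
    name, gain, and update rule are assumptions, not the paper's method.
    """
    current = np.asarray(current, dtype=float)
    goal = np.asarray(goal, dtype=float)
    gap = goal - current                  # remaining angular gap
    dist = np.linalg.norm(gap)
    if dist < 1e-9:
        return current                    # already at the goal
    max_step = gain * stump_speed * dt    # travel budget this tick
    step = min(max_step, dist)            # never overshoot the goal
    return current + gap / dist * step

# A new goal with zero stump speed causes no movement at all;
# with the stump moving, the joints advance a bounded step toward it.
frozen = step_distal_joints([0.0, 0.0], [1.0, 1.0], stump_speed=0.0)
moved = step_distal_joints([0.0, 0.0], [1.0, 1.0], stump_speed=2.0, dt=0.1)
```

Capping each step by the gap length keeps the trajectory from oscillating around the goal, which matches the abstract's aim of eliminating rapid, unintended prosthesis movements.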


Figures (g001–g006):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/674c/11430227/afc6ab25650e/biomimetics-09-00532-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/674c/11430227/9195ec125c5c/biomimetics-09-00532-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/674c/11430227/aef43f6a7890/biomimetics-09-00532-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/674c/11430227/edcc6509c81e/biomimetics-09-00532-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/674c/11430227/009a7fa7a34a/biomimetics-09-00532-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/674c/11430227/67d5cec75bc7/biomimetics-09-00532-g006.jpg

Similar articles

1
Movement-Based Prosthesis Control with Angular Trajectory Is Getting Closer to Natural Arm Coordination.
Biomimetics (Basel). 2024 Sep 4;9(9):532. doi: 10.3390/biomimetics9090532.
2
Shoulder kinematics plus contextual target information enable control of multiple distal joints of a simulated prosthetic arm and hand.
J Neuroeng Rehabil. 2021 Jan 6;18(1):3. doi: 10.1186/s12984-020-00793-0.
3
Intuitive movement-based prosthesis control enables arm amputees to reach naturally in virtual reality.
Elife. 2023 Oct 17;12:RP87317. doi: 10.7554/eLife.87317.
4
Integrated control of hand transport and orientation during prehension movements.
Exp Brain Res. 1996 Jul;110(2):265-78. doi: 10.1007/BF00228557.
5
Postural control of three-dimensional prehension movements.
J Neurophysiol. 1997 Jan;77(1):452-64. doi: 10.1152/jn.1997.77.1.452.
6
Comparison of range-of-motion and variability in upper body movements between transradial prosthesis users and able-bodied controls when executing goal-oriented tasks.
J Neuroeng Rehabil. 2014 Sep 6;11:132. doi: 10.1186/1743-0003-11-132.
7
Deep learning-based artificial vision for grasp classification in myoelectric hands.
J Neural Eng. 2017 Jun;14(3):036025. doi: 10.1088/1741-2552/aa6802. Epub 2017 May 3.
8
Movement quality of conventional prostheses and the DEKA Arm during everyday tasks.
Prosthet Orthot Int. 2017 Feb;41(1):33-40. doi: 10.1177/0309364616631348. Epub 2016 Jul 9.
9
Modifying upper-limb inter-joint coordination in healthy subjects by training with a robotic exoskeleton.
J Neuroeng Rehabil. 2017 Jun 12;14(1):55. doi: 10.1186/s12984-017-0254-x.
10
Movement-Based Control for Upper-Limb Prosthetics: Is the Regression Technique the Key to a Robust and Accurate Control?
Front Neurorobot. 2018 Jul 26;12:41. doi: 10.3389/fnbot.2018.00041. eCollection 2018.

References cited in this article

1
3D-ARM-Gaze: a public dataset of 3D Arm Reaching Movements with Gaze information in virtual reality.
Sci Data. 2024 Aug 30;11(1):951. doi: 10.1038/s41597-024-03765-4.
2
Intuitive movement-based prosthesis control enables arm amputees to reach naturally in virtual reality.
Elife. 2023 Oct 17;12:RP87317. doi: 10.7554/eLife.87317.
3
A tool for measuring mental workload during prosthesis use: The Prosthesis Task Load Index (PROS-TLX).
PLoS One. 2023 May 4;18(5):e0285382. doi: 10.1371/journal.pone.0285382. eCollection 2023.
4
Trajectory Control-An Effective Strategy for Controlling Multi-DOF Upper Limb Prosthetic Devices.
IEEE Trans Neural Syst Rehabil Eng. 2022;30:420-430. doi: 10.1109/TNSRE.2022.3151055. Epub 2022 Feb 23.
5
Shoulder kinematics plus contextual target information enable control of multiple distal joints of a simulated prosthetic arm and hand.
J Neuroeng Rehabil. 2021 Jan 6;18(1):3. doi: 10.1186/s12984-020-00793-0.
6
Task-Space Synergies for Reaching Using Upper-Limb Prostheses.
IEEE Trans Neural Syst Rehabil Eng. 2020 Dec;28(12):2966-2977. doi: 10.1109/TNSRE.2020.3036320. Epub 2021 Jan 28.
7
Reachy, a 3D-Printed Human-Like Robotic Arm as a Testbed for Human-Robot Control Strategies.
Front Neurorobot. 2019 Aug 14;13:65. doi: 10.3389/fnbot.2019.00065. eCollection 2019.
8
New developments in prosthetic arm systems.
Orthop Res Rev. 2016 Jul 7;8:31-39. doi: 10.2147/ORR.S71468. eCollection 2016.
9
Movement-Based Control for Upper-Limb Prosthetics: Is the Regression Technique the Key to a Robust and Accurate Control?
Front Neurorobot. 2018 Jul 26;12:41. doi: 10.3389/fnbot.2018.00041. eCollection 2018.
10
Synergistic Elbow Control for a Myoelectric Transhumeral Prosthesis.
IEEE Trans Neural Syst Rehabil Eng. 2018 Feb;26(2):468-476. doi: 10.1109/TNSRE.2017.2781719.