Suppr 超能文献


Leveraging LSTM, tactile sensors, and haptic feedback to augment prosthetic control via grasp type prediction and grasp type feedback.

Authors

Zhuwawu Sudhir Solomon, El-Hussieny Haitham

Affiliation

Department of Mechatronics and Robotics Engineering, Egypt-Japan University of Science and Technology (E-JUST), New Burg El-Arab, Alexandria, 21934, Egypt.

Publication

Sci Rep. 2025 Aug 14;15(1):29781. doi: 10.1038/s41598-025-92651-z.

DOI:10.1038/s41598-025-92651-z
PMID:40804118
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12350629/
Abstract

Performing grasping tasks with prosthetic hands is often a slow and clumsy affair, requiring heavy reliance on visual feedback and greatly limiting the use of prosthetic hands in daily life activities. Automating the grasping tasks via machine learning models has emerged as a promising solution. However, these methods diminish user control, transforming the prosthetic hand into more of a tool than a natural extension of the body. Alternatively, this work presents a method to predict and provide haptic feedback on the prosthetic hand's current grasp, aiming to aid user decision-making and control without relying on visual cues. Soft tactile sensors and deep learning models recognize the prosthetic hand's grasp type, which is conveyed to the user through a unique haptic stimulation pattern. Long Short-Term Memory (LSTM) networks were employed for the prediction, trained on a diverse dataset of five everyday grasp types. Real-world tests using prosthetic and human hands demonstrate the approach's practicality, with confident predictions made in under one second, achieving average accuracies of 88.68% and 86.44%, respectively. The proposed approach prioritizes user control, providing real-time grasp feedback with the goal of fostering a stronger sense of embodiment despite potentially reduced grasping accuracy compared to previous approaches emphasizing automation.

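The abstract describes classifying a tactile time series into one of five grasp types with an LSTM. The forward pass of such a classifier can be sketched as below — a minimal, illustrative NumPy implementation, not the paper's actual model: the sensor count, hidden size, sampling rate, and all weights here are assumptions, and training is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

N_SENSORS = 8   # assumed number of tactile channels (illustrative)
HIDDEN = 32     # assumed LSTM hidden size (illustrative)
N_GRASPS = 5    # five everyday grasp types, as stated in the abstract

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyLSTMClassifier:
    """Single-layer LSTM + softmax head, forward pass only."""

    def __init__(self, n_in, n_hidden, n_out):
        # One stacked weight matrix for the input/forget/cell/output gates.
        self.W = rng.standard_normal((4 * n_hidden, n_in + n_hidden)) * 0.1
        self.b = np.zeros(4 * n_hidden)
        self.Wy = rng.standard_normal((n_out, n_hidden)) * 0.1
        self.by = np.zeros(n_out)
        self.n_hidden = n_hidden

    def forward(self, x_seq):
        h = np.zeros(self.n_hidden)
        c = np.zeros(self.n_hidden)
        for x_t in x_seq:                       # one tactile frame per step
            z = self.W @ np.concatenate([x_t, h]) + self.b
            i, f, g, o = np.split(z, 4)
            i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
            c = f * c + i * np.tanh(g)          # cell-state update
            h = o * np.tanh(c)                  # hidden state
        logits = self.Wy @ h + self.by
        p = np.exp(logits - logits.max())       # numerically stable softmax
        return p / p.sum()                      # grasp-type probabilities

model = TinyLSTMClassifier(N_SENSORS, HIDDEN, N_GRASPS)
frames = rng.standard_normal((50, N_SENSORS))   # ~1 s of synthetic frames
probs = model.forward(frames)
print(probs.argmax(), probs.sum())
```

The argmax of `probs` would be the predicted grasp type, which the paper's system then maps to a distinct haptic stimulation pattern; streaming frames through the cell one at a time is what allows a confident prediction within the sub-second window the abstract reports.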

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/4ed58341b338/41598_2025_92651_Fig22_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/c7569e2e7055/41598_2025_92651_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/913a25cb5659/41598_2025_92651_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/766abe8023a1/41598_2025_92651_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/637381d508bd/41598_2025_92651_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/c0af11880a9e/41598_2025_92651_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/1265f6f3c21a/41598_2025_92651_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/8bf5bb9c098f/41598_2025_92651_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/48b06657e3d2/41598_2025_92651_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/d972f4b78732/41598_2025_92651_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/13648e9d4f15/41598_2025_92651_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/390973416939/41598_2025_92651_Fig11_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/6f3c93101517/41598_2025_92651_Fig12_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/21f34ae8a4fd/41598_2025_92651_Fig13_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/292cb811de0b/41598_2025_92651_Fig14_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/85e2f9652741/41598_2025_92651_Fig15_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/57ac64ccbc48/41598_2025_92651_Fig16_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/a9c65f875240/41598_2025_92651_Fig17_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/cbaf42a91a51/41598_2025_92651_Fig18_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/8bd13980353e/41598_2025_92651_Fig19_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/efe34ed3ac81/41598_2025_92651_Fig20_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a1f4/12350629/70826c6e0dc3/41598_2025_92651_Fig21_HTML.jpg

Similar Articles

1
Leveraging LSTM, tactile sensors, and haptic feedback to augment prosthetic control via grasp type prediction and grasp type feedback.
Sci Rep. 2025 Aug 14;15(1):29781. doi: 10.1038/s41598-025-92651-z.
2
Prescription of Controlled Substances: Benefits and Risks
3
Short-Term Memory Impairment
4
HADAR Hand: 13-DoF Hybrid Actuation-Based Dextrous Anthropomorphic Robotic Hand.
IEEE Int Conf Rehabil Robot. 2025 May;2025:712-717. doi: 10.1109/ICORR66766.2025.11062954.
5
Comparison of Two Modern Survival Prediction Tools, SORG-MLA and METSSS, in Patients With Symptomatic Long-bone Metastases Who Underwent Local Treatment With Surgery Followed by Radiotherapy and With Radiotherapy Alone.
Clin Orthop Relat Res. 2024 Dec 1;482(12):2193-2208. doi: 10.1097/CORR.0000000000003185. Epub 2024 Jul 23.
6
Understanding the Utility of State-Based Haptic Feedback in Tendon-Driven Anthropomorphic Prostheses.
IEEE Trans Neural Syst Rehabil Eng. 2025;33:2055-2063. doi: 10.1109/TNSRE.2025.3573871.
7
Emerging Frontiers in Robotic Upper-Limb Prostheses: Mechanisms, Materials, Tactile Sensors and Machine Learning-Based EMG Control: A Comprehensive Review.
Sensors (Basel). 2025 Jun 22;25(13):3892. doi: 10.3390/s25133892.
8
Virtual reality-based myoelectric prosthetic control training: Effects of action observation and motor imagery with visual feedback of electromyographic signals.
Prosthet Orthot Int. 2024 Dec 18;49(4):400-407. doi: 10.1097/PXR.0000000000000392.
9
Examining the physical and psychological effects of combining multimodal feedback with continuous control in prosthetic hands.
Sci Rep. 2025 Jan 29;15(1):3690. doi: 10.1038/s41598-025-87048-x.
10
Study on Prosthetic Hand Proprioception Feedback Based on Hybrid Vibro-Electrotactile Stimulation.
IEEE Trans Neural Syst Rehabil Eng. 2025;33:2967-2976. doi: 10.1109/TNSRE.2025.3593354.
