
A Transformer-Based Neural Network for Gait Prediction in Lower Limb Exoskeleton Robots Using Plantar Force.

Affiliations

School of Electronic and Information, Zhongyuan University of Technology, Zhengzhou 451191, China.

Research Organization of Science and Technology, Ritsumeikan University, Kusatsu 525-8577, Japan.

Publication Information

Sensors (Basel). 2023 Jul 20;23(14):6547. doi: 10.3390/s23146547.

DOI: 10.3390/s23146547
PMID: 37514841
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10384092/
Abstract

Lower limb exoskeleton robots have shown significant research value due to their capabilities of providing assistance to wearers and improving physical motion functions. As a type of robotic technology, wearable robots are directly in contact with the wearer's limbs during operation, necessitating a high level of human-robot collaboration to ensure safety and efficacy. Furthermore, gait prediction for the wearer, which helps to compensate for sensor delays and provide references for controller design, is crucial for improving the human-robot collaboration capability. For gait prediction, the plantar force intrinsically reflects crucial gait patterns regardless of individual differences. Specifically, the plantar force comprises two three-axis forces, one for each foot, which vary over time and reflect the gait patterns, albeit indistinctly. In this paper, we developed a transformer-based neural network (TFSformer) comprising convolution and variational mode decomposition (VMD) to predict bilateral hip and knee joint angles utilizing the plantar pressure. Given the distinct information contained in the temporal and the force-space dimensions of plantar pressure, the encoder uses 1D convolution to obtain integrated features across the two dimensions. As for the decoder, it utilizes a multi-channel attention mechanism to simultaneously focus on both dimensions and a deep multi-channel attention structure to reduce computational and memory consumption. Furthermore, VMD is applied to the networks to better distinguish the trends and changes in the data. The model is trained and tested on a self-constructed dataset consisting of data from 35 volunteers. The experimental results show that TFSformer reduces the mean absolute error (MAE) by 10.83%, 15.04% and 8.05% and the mean squared error (MSE) by 20.40%, 29.90% and 12.60% compared to the CNN model, the transformer model and the CNN-Transformer model, respectively.
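The error metrics reported above (MAE, MSE, and the relative reductions versus the baselines) can be sketched in a few lines. This is an illustration only, under assumed shapes inferred from the text (plantar force as a two-feet × three-axis, i.e. 6-channel, time series; four bilateral hip/knee angles as output) and toy numbers; it is not the authors' code or data:

```python
# Minimal sketch of the evaluation metrics described in the abstract.
# Assumed (not from the paper's code):
#   input:  plantar force = two feet x three-axis force over T steps -> (T, 6)
#   output: bilateral hip and knee joint angles -> (T, 4)

def mae(pred, true):
    """Mean absolute error over all time steps and joint angles."""
    n = sum(len(row) for row in true)
    return sum(abs(p - t) for pr, tr in zip(pred, true) for p, t in zip(pr, tr)) / n

def mse(pred, true):
    """Mean squared error over all time steps and joint angles."""
    n = sum(len(row) for row in true)
    return sum((p - t) ** 2 for pr, tr in zip(pred, true) for p, t in zip(pr, tr)) / n

def reduction_pct(model_err, baseline_err):
    """Relative error reduction, e.g. 'reduces the MAE by 10.83% vs the CNN model'."""
    return 100.0 * (baseline_err - model_err) / baseline_err

# Toy data: T = 3 time steps, 4 joint angles (left/right hip and knee, degrees).
true = [[10.0, 20.0, 30.0, 40.0],
        [11.0, 21.0, 31.0, 41.0],
        [12.0, 22.0, 32.0, 42.0]]
pred = [[10.5, 19.5, 30.0, 40.0],
        [11.0, 21.5, 31.0, 40.5],
        [12.5, 22.0, 31.5, 42.0]]

print(mae(pred, true))           # -> 0.25
print(mse(pred, true))           # -> 0.125
print(reduction_pct(0.25, 0.5))  # -> 50.0
```

Averaging over both time steps and joint channels matches how per-joint errors are typically aggregated into the single MAE/MSE figures quoted in the abstract.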


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/84b4a22685ba/sensors-23-06547-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/1e8b0b7ce9b6/sensors-23-06547-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/bf5e8c8f7a83/sensors-23-06547-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/8f1cffe5d4d1/sensors-23-06547-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/6379393155c8/sensors-23-06547-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/4dd8c86c0164/sensors-23-06547-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/b8b46f4951b6/sensors-23-06547-g007.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/0a7d7c3746b2/sensors-23-06547-g008.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/f274805b6a7e/sensors-23-06547-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/ef586cac5726/sensors-23-06547-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/42ee/10384092/dbcaad5d488a/sensors-23-06547-g011.jpg

Similar Articles

1. A Transformer-Based Neural Network for Gait Prediction in Lower Limb Exoskeleton Robots Using Plantar Force.
Sensors (Basel). 2023 Jul 20;23(14):6547. doi: 10.3390/s23146547.
2. Design and analysis of a lower limb assistive exoskeleton robot.
Technol Health Care. 2024;32(S1):79-93. doi: 10.3233/THC-248007.
3. Estimation of the Continuous Walking Angle of Knee and Ankle (Talocrural Joint, Subtalar Joint) of a Lower-Limb Exoskeleton Robot Using a Neural Network.
Sensors (Basel). 2021 Apr 16;21(8):2807. doi: 10.3390/s21082807.
4. Prediction of Limb Joint Angles Based on Multi-Source Signals by GS-GRNN for Exoskeleton Wearer.
Sensors (Basel). 2020 Feb 18;20(4):1104. doi: 10.3390/s20041104.
5. Prediction of Plantar Forces During Gait Using Wearable Sensors and Deep Neural Networks.
Annu Int Conf IEEE Eng Med Biol Soc. 2019 Jul;2019:3629-3632. doi: 10.1109/EMBC.2019.8857752.
6. Design of a control framework for lower limb exoskeleton rehabilitation robot based on predictive assessment.
Clin Biomech (Bristol). 2022 May;95:105660. doi: 10.1016/j.clinbiomech.2022.105660. Epub 2022 May 6.
7. Future Image Prediction of Plantar Pressure During Gait Using Spatio-temporal Transformer.
Annu Int Conf IEEE Eng Med Biol Soc. 2022 Jul;2022:3039-3042. doi: 10.1109/EMBC48229.2022.9871688.
8. Human Body Mixed Motion Pattern Recognition Method Based on Multi-Source Feature Parameter Fusion.
Sensors (Basel). 2020 Jan 18;20(2):537. doi: 10.3390/s20020537.
9. Optimization of Torque-Control Model for Quasi-Direct-Drive Knee Exoskeleton Robots Based on Regression Forecasting.
Sensors (Basel). 2024 Feb 26;24(5):1505. doi: 10.3390/s24051505.
10. Simulation on the Effect of Gait Variability, Delays, and Inertia with Respect to Wearer Energy Savings with Exoskeleton Assistance.
IEEE Int Conf Rehabil Robot. 2019 Jun;2019:506-511. doi: 10.1109/ICORR.2019.8779459.

Cited By

1. A Comprehensive Understanding of Postural Tone Biomechanics: Intrinsic Stiffness, Functional Stiffness, Antagonist Coactivation, and COP Dynamics in Post-Stroke Adults.
Sensors (Basel). 2025 Mar 30;25(7):2196. doi: 10.3390/s25072196.

References

1. Enhanced mechanisms of pooling and channel attention for deep learning feature maps.
PeerJ Comput Sci. 2022 Nov 21;8:e1161. doi: 10.7717/peerj-cs.1161. eCollection 2022.
2. Gait Prediction and Variable Admittance Control for Lower Limb Exoskeleton With Measurement Delay and Extended-State-Observer.
IEEE Trans Neural Netw Learn Syst. 2023 Nov;34(11):8693-8706. doi: 10.1109/TNNLS.2022.3152255. Epub 2023 Oct 27.
3. Musculoskeletal modeling and humanoid control of robots based on human gait data.
PeerJ Comput Sci. 2021 Aug 9;7:e657. doi: 10.7717/peerj-cs.657. eCollection 2021.
4. Review of control strategies for lower-limb exoskeletons to assist gait.
J Neuroeng Rehabil. 2021 Jul 27;18(1):119. doi: 10.1186/s12984-021-00906-3.
5. A Survey of Convolutional Neural Networks: Analysis, Applications, and Prospects.
IEEE Trans Neural Netw Learn Syst. 2022 Dec;33(12):6999-7019. doi: 10.1109/TNNLS.2021.3084827. Epub 2022 Nov 30.
6. Human Motion Intent Description Based on Bumpless Switching Mechanism for Rehabilitation Robot.
IEEE Trans Neural Syst Rehabil Eng. 2021;29:673-682. doi: 10.1109/TNSRE.2021.3066592. Epub 2021 Mar 30.
7. Method for positioning and rehabilitation training with the ExoAtlet® powered exoskeleton.
MethodsX. 2020 Mar 19;7:100849. doi: 10.1016/j.mex.2020.100849. eCollection 2020.
8. Human-in-the-loop optimization of exoskeleton assistance during walking.
Science. 2017 Jun 23;356(6344):1280-1284. doi: 10.1126/science.aal5054.
9. Mobility Outcomes Following Five Training Sessions with a Powered Exoskeleton.
Top Spinal Cord Inj Rehabil. 2015 Spring;21(2):93-9. doi: 10.1310/sci2102-93. Epub 2015 Apr 12.
10. Restoration of gait for spinal cord injury patients using HAL with intention estimator for preferable swing speed.
IEEE Trans Neural Syst Rehabil Eng. 2015 Mar;23(2):308-18. doi: 10.1109/TNSRE.2014.2364618. Epub 2014 Oct 23.