Suppr 超能文献


Multivariate CNN Model for Human Locomotion Activity Recognition with a Wearable Exoskeleton Robot

Authors

Son Chang-Sik, Kang Won-Seok

Affiliations

Division of Intelligent Robot, Daegu Gyeongbuk Institute of Science & Technology (DGIST), Daegu 42988, Republic of Korea.

Department of Biomedical Science, Graduate School, Kyungpook National University, Daegu 41944, Republic of Korea.

Publication

Bioengineering (Basel). 2023 Sep 13;10(9):1082. doi: 10.3390/bioengineering10091082.

DOI: 10.3390/bioengineering10091082
PMID: 37760184
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10525937/
Abstract

This study introduces a novel convolutional neural network (CNN) architecture, encompassing both single and multi-head designs, developed to identify a user's locomotion activity while using a wearable lower limb robot. Our research involved 500 healthy adult participants in an activities of daily living (ADL) space, conducted from 1 September to 30 November 2022. We collected prospective data to identify five locomotion activities (level ground walking, stair ascent/descent, and ramp ascent/descent) across three terrains: flat ground, staircase, and ramp. To evaluate the predictive capabilities of the proposed CNN architectures, we compared their performance with three other models: one CNN and two hybrid models (CNN-LSTM and LSTM-CNN). Experiments were conducted using multivariate signals of various types obtained from electromyograms (EMGs) and the wearable robot. Our results reveal that the deeper CNN architecture significantly surpasses the performance of the three competing models. The proposed model, leveraging encoder data such as hip angles and velocities, along with postural signals such as roll, pitch, and yaw from the wearable lower limb robot, achieved superior performance with an inference speed of 1.14 s. Specifically, the F-measure of the proposed model reached 96.17%, compared to 90.68% for DDLMI, 94.41% for DeepConvLSTM, and 95.57% for LSTM-CNN.
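The abstract describes classifying multivariate time-series signals (hip angles and velocities, roll/pitch/yaw, EMG) into five locomotion activities. The record does not state the paper's exact preprocessing, but models of this kind typically consume fixed-length overlapping windows of the signal streams. As an illustration only, with a hypothetical window length and stride, that segmentation step can be sketched in NumPy:

```python
import numpy as np

def sliding_windows(signals: np.ndarray, window: int, stride: int) -> np.ndarray:
    """Segment a multivariate time series of shape (T, C) into
    overlapping windows of shape (N, window, C), one window per
    classifier input (each window would receive one activity label)."""
    T, _ = signals.shape
    starts = range(0, T - window + 1, stride)
    return np.stack([signals[s:s + window] for s in starts])

# Hypothetical example: 9 channels (e.g. 2 encoder angles/velocities,
# roll/pitch/yaw, EMG channels) sampled for 1000 time steps.
x = np.random.randn(1000, 9)
w = sliding_windows(x, window=128, stride=64)
print(w.shape)  # → (14, 128, 9)
```

Each resulting window is one training or inference example; a multi-head design, as mentioned in the abstract, would typically route different channel groups (e.g. encoder vs. postural signals) into separate convolutional branches before merging.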


Figures
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/3f72b7fe35cc/bioengineering-10-01082-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/142938ddc56e/bioengineering-10-01082-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/d2eed09ba572/bioengineering-10-01082-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/5b1b503b97cb/bioengineering-10-01082-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/b4247f3ab85d/bioengineering-10-01082-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/b0986922263b/bioengineering-10-01082-g009.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/8e3bc4ff02f7/bioengineering-10-01082-g010.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/53b53524941b/bioengineering-10-01082-g011.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/cb2e/10525937/a912ffccb881/bioengineering-10-01082-g012.jpg

Similar Articles

1. Multivariate CNN Model for Human Locomotion Activity Recognition with a Wearable Exoskeleton Robot.
   Bioengineering (Basel). 2023 Sep 13;10(9):1082. doi: 10.3390/bioengineering10091082.
2. Locomotion Mode Recognition Algorithm Based on Gaussian Mixture Model Using IMU Sensors.
   Sensors (Basel). 2021 Apr 15;21(8):2785. doi: 10.3390/s21082785.
3. Real-Time Continuous Locomotion Mode Recognition and Transition Prediction for Human With Lower Limb Exoskeleton.
   IEEE J Biomed Health Inform. 2025 Feb;29(2):1074-1086. doi: 10.1109/JBHI.2024.3462826. Epub 2025 Feb 10.
4. Estimation of Muscle Forces of Lower Limbs Based on CNN-LSTM Neural Network and Wearable Sensor System.
   Sensors (Basel). 2024 Feb 5;24(3):1032. doi: 10.3390/s24031032.
5. A SE-DenseNet-LSTM model for locomotion mode recognition in lower limb exoskeleton.
   PeerJ Comput Sci. 2024 Feb 29;10:e1881. doi: 10.7717/peerj-cs.1881. eCollection 2024.
6. A Light-Weight Artificial Neural Network for Recognition of Activities of Daily Living.
   Sensors (Basel). 2023 Jun 24;23(13):5854. doi: 10.3390/s23135854.
7. A Novel Gait Phase Recognition Method Based on DPF-LSTM-CNN Using Wearable Inertial Sensors.
   Sensors (Basel). 2023 Jun 26;23(13):5905. doi: 10.3390/s23135905.
8. Comparison of machine learning and deep learning-based methods for locomotion mode recognition using a single inertial measurement unit.
   Front Neurorobot. 2022 Nov 29;16:923164. doi: 10.3389/fnbot.2022.923164. eCollection 2022.
9. StairNet: visual recognition of stairs for human-robot locomotion.
   Biomed Eng Online. 2024 Feb 15;23(1):20. doi: 10.1186/s12938-024-01216-0.
10. BioMAT: An Open-Source Biomechanics Multi-Activity Transformer for Joint Kinematic Predictions Using Wearable Sensors.
   Sensors (Basel). 2023 Jun 21;23(13):5778. doi: 10.3390/s23135778.

References Cited in This Article

1. A multi-scale feature extraction fusion model for human activity recognition.
   Sci Rep. 2022 Nov 30;12(1):20620. doi: 10.1038/s41598-022-24887-y.
2. Human Activity Recognition via Hybrid Deep Learning Based Model.
   Sensors (Basel). 2022 Jan 1;22(1):323. doi: 10.3390/s22010323.
3. Real-Time Hierarchical Classification of Time Series Data for Locomotion Mode Detection.
   IEEE J Biomed Health Inform. 2022 Apr;26(4):1749-1760. doi: 10.1109/JBHI.2021.3106110. Epub 2022 Apr 14.
4. Lower limb kinematic, kinetic, and EMG data from young healthy humans during walking at controlled speeds.
   Sci Data. 2021 Apr 12;8(1):103. doi: 10.1038/s41597-021-00881-3.
5. Fusion of Bilateral Lower-Limb Neuromechanical Signals Improves Prediction of Locomotor Activities.
   Front Robot AI. 2018 Jun 26;5:78. doi: 10.3389/frobt.2018.00078. eCollection 2018.
6. A Quantitative Comparison of Overlapping and Non-Overlapping Sliding Windows for Human Activity Recognition Using Inertial Sensors.
   Sensors (Basel). 2019 Nov 18;19(22):5026. doi: 10.3390/s19225026.
7. A soft robotic exosuit improves walking in patients after stroke.
   Sci Transl Med. 2017 Jul 26;9(400). doi: 10.1126/scitranslmed.aai9084.
8. Human-in-the-loop optimization of exoskeleton assistance during walking.
   Science. 2017 Jun 23;356(6344):1280-1284. doi: 10.1126/science.aal5054.
9. Robot-assisted gait training for stroke patients: current state of the art and perspectives of robotics.
   Neuropsychiatr Dis Treat. 2017 May 15;13:1303-1311. doi: 10.2147/NDT.S114102. eCollection 2017.
10. Deep Convolutional and LSTM Recurrent Neural Networks for Multimodal Wearable Activity Recognition.
   Sensors (Basel). 2016 Jan 18;16(1):115. doi: 10.3390/s16010115.