IAS-Lab, Department of Information Engineering, University of Padova, Via Gradenigo 6/B, Padova 35131, Italy.
J Neural Eng. 2020 Jul 13;17(4):046011. doi: 10.1088/1741-2552/ab9842.
Mobile Brain/Body Imaging (MoBI) frameworks have allowed the research community to find evidence of cortical involvement at walking initiation and during locomotion. However, decoding gait patterns from brain signals remains an open challenge. The aim of this work is to propose and validate a deep learning model to decode gait phases from electroencephalography (EEG).
A Long Short-Term Memory (LSTM) deep neural network was trained to handle the time-dependent information within brain signals during locomotion. The EEG signals were preprocessed by means of Artifact Subspace Reconstruction (ASR) and Reliable Independent Component Analysis (RELICA) to ensure that classification performance was not affected by movement-related artifacts.
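For illustration, the sketch below shows a minimal LSTM-based gait-phase decoder in PyTorch operating on windowed EEG. The window length, channel count, hidden size, and class layout are placeholder assumptions chosen for the example; they are not the architecture or hyperparameters reported in the paper.

```python
# Minimal sketch (PyTorch) of an LSTM gait-phase decoder.
# Input shape assumption: (batch, time, channels) windows of preprocessed EEG.
# All sizes below are illustrative, not the authors' reported configuration.
import torch
import torch.nn as nn

class GaitPhaseLSTM(nn.Module):
    def __init__(self, n_channels=32, hidden_size=64, n_classes=2):
        super().__init__()
        # The LSTM consumes the EEG time course sample by sample
        self.lstm = nn.LSTM(input_size=n_channels, hidden_size=hidden_size,
                            batch_first=True)
        # Linear read-out maps the final hidden state to gait-phase logits
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time, channels) -> last hidden state -> class logits
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1])

# Example: 16 windows of 200 samples from 32 EEG channels
model = GaitPhaseLSTM()
logits = model(torch.randn(16, 200, 32))   # shape (16, 2): swing vs. stance
```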
The network was evaluated on a dataset of 11 healthy subjects walking on a treadmill. The proposed decoding approach shows a robust reconstruction (AUC > 90%) of the gait patterns (i.e. swing and stance states) of both legs together, or of each leg independently.
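As a reference for the metric quoted above, the AUC for a given leg can be computed from binary swing/stance labels and decoder scores as in this scikit-learn sketch; the arrays are synthetic placeholders, not the study's data.

```python
# Illustrative per-leg AUC computation; labels and scores are synthetic.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=1000)            # 0 = stance, 1 = swing
y_score = y_true * 0.6 + rng.random(1000) * 0.4   # synthetic decoder scores

auc = roc_auc_score(y_true, y_score)
print(f"AUC = {auc:.3f}")                         # robust decoding corresponds to AUC > 0.90
```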
Our results support, for the first time, the use of a memory-based deep learning classifier to decode walking activity from non-invasive brain recordings. We suggest that this classifier, when exploited in real time, could provide a more effective input for devices restoring locomotion in impaired people.