Wu Xiaolong, Li Guangye, Jiang Shize, Wellington Scott, Liu Shengjie, Wu Zehan, Metcalfe Benjamin, Chen Liang, Zhang Dingguo
Centre for Autonomous Robotics (CENTAUR), Department of Electronic and Electrical Engineering, University of Bath, Bath, United Kingdom.
State Key Laboratory of Mechanical Systems and Vibrations, Institute of Robotics, Shanghai Jiao Tong University, Shanghai, People's Republic of China.
J Neural Eng. 2022 Apr 21;19(2). doi: 10.1088/1741-2552/ac65b1.
Brain-computer interfaces (BCIs) have the potential to bypass damaged neural pathways and restore functionality lost due to injury or disease. Approaches to decoding kinematic information are well documented; however, the decoding of kinetic information has received less attention. Additionally, the possibility of using stereo-electroencephalography (SEEG) for kinetic decoding during hand grasping tasks is still largely unknown. Thus, the objective of this paper is to demonstrate kinetic parameter decoding using SEEG in patients performing a grasping task with two different force levels under two different ascending rates. Temporal-spectral representations were studied to investigate frequency modulation under different force tasks. Then, force amplitude was decoded from SEEG recordings using multiple decoders, including a linear model, a partial least squares model, an unscented Kalman filter, and three deep learning models (a shallow convolutional neural network, a deep convolutional neural network, and the proposed CNN+RNN neural network). The current study showed that: (a) for some channels, both low-frequency modulation (event-related desynchronization (ERD)) and high-frequency modulation (event-related synchronization (ERS)) were sustained during prolonged force-holding periods; (b) continuously changing grasp force can be decoded from the SEEG signals; and (c) the novel CNN+RNN deep learning model achieved the best decoding performance, with the predicted force magnitude closely aligned to the ground truth under different force amplitudes and changing rates. This work verified the possibility of decoding continuously changing grasp force using SEEG recordings. The results presented in this study demonstrate the potential of SEEG recordings for future BCI applications.
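As an illustration of the simplest decoder family named in the abstract, the sketch below fits a linear model mapping windowed SEEG band-power features to continuous grasp-force amplitude. All data here are synthetic, and the channel count, feature choice, and ridge regularization are illustrative assumptions, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 500 time windows from 8 SEEG contacts.
n_samples, n_channels = 500, 8
true_w = rng.normal(size=n_channels)  # unknown linear mapping (for simulation only)

# Stand-in band-power features and a noisy "ground-truth" force trace.
X = rng.normal(size=(n_samples, n_channels))
force = X @ true_w + 0.1 * rng.normal(size=n_samples)

# Ridge-regularized least squares: w = (X^T X + lam*I)^-1 X^T y
lam = 1e-2
w_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_channels), X.T @ force)

# Evaluate with the Pearson correlation between decoded and true force,
# a common metric for continuous kinetic decoding.
pred = X @ w_hat
corr = np.corrcoef(pred, force)[0, 1]
```

The deep models compared in the paper (shallow/deep CNNs and the CNN+RNN) replace the hand-crafted linear map with learned temporal-spectral features, but they are evaluated against the same kind of continuous force target shown here.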