School of Computer Science and Engineering, Nanyang Technological University, Singapore 639798, Singapore.
Department of Artificial Intelligence, Korea University, Seoul 02841, South Korea.
Neural Netw. 2021 Apr;136:1-10. doi: 10.1016/j.neunet.2020.12.013. Epub 2020 Dec 23.
In recent years, deep learning has emerged as a powerful tool for developing Brain-Computer Interface (BCI) systems. However, for deep learning models trained entirely on data from a specific individual, performance gains have been only marginal owing to the limited availability of subject-specific data. To overcome this, many transfer-based approaches have been proposed, in which deep networks are trained on pre-existing data from other subjects and evaluated on new target subjects. This mode of transfer learning, however, faces the challenge of substantial inter-subject variability in brain data. To address this, we propose five schemes for adapting a deep convolutional neural network (CNN)-based electroencephalography (EEG)-BCI system for decoding hand motor imagery (MI). Each scheme fine-tunes an extensively trained, pre-trained model and adapts it to enhance evaluation performance on a target subject. We report the highest subject-independent performance, with an average (N=54) accuracy of 84.19% (±9.98%) for two-class motor imagery, whereas the best accuracy reported in the literature on this dataset is 74.15% (±15.83%). Further, we obtain a statistically significant improvement (p=0.005) in classification using the proposed adaptation schemes compared to the baseline subject-independent model.
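The following is a minimal, hypothetical sketch of the general idea described above: starting from a CNN pre-trained on other subjects' EEG data and fine-tuning part of it on a small amount of target-subject data. The architecture, layer names, and hyperparameters are illustrative assumptions and do not reproduce the paper's actual model or its five adaptation schemes.

```python
# Illustrative sketch only: fine-tune a pre-trained CNN on a target subject's
# EEG calibration trials. Architecture and hyperparameters are assumptions,
# not the paper's actual five adaptation schemes.
import torch
import torch.nn as nn


class EEGCNN(nn.Module):
    """Toy CNN for 2-class motor-imagery EEG (input: channels x time samples)."""

    def __init__(self, n_channels=62, n_samples=1000, n_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=(1, 25), padding=(0, 12)),  # temporal filtering
            nn.Conv2d(16, 32, kernel_size=(n_channels, 1)),          # spatial filtering
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 75), stride=(1, 15)),
        )
        with torch.no_grad():
            n_feats = self.features(torch.zeros(1, 1, n_channels, n_samples)).numel()
        self.classifier = nn.Linear(n_feats, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))


def adapt_to_subject(model, target_loader, epochs=10, lr=1e-4, freeze_features=True):
    """One possible adaptation scheme (an assumption for illustration):
    freeze the convolutional feature extractor learned from other subjects
    and fine-tune only the classifier on the target subject's trials."""
    if freeze_features:
        for p in model.features.parameters():
            p.requires_grad = False
    params = [p for p in model.parameters() if p.requires_grad]
    optimizer = torch.optim.Adam(params, lr=lr)
    criterion = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in target_loader:  # x: (batch, 1, channels, samples), y: class labels
            optimizer.zero_grad()
            loss = criterion(model(x), y)
            loss.backward()
            optimizer.step()
    return model
```

In this sketch, other schemes could be obtained by varying which layers are unfrozen or by adjusting the learning rate for the pre-trained weights; the abstract does not specify the paper's exact choices.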