

Thinker invariance: enabling deep neural networks for BCI across more people.

Affiliations

University of Toronto, Vector Institute for Artificial Intelligence; Toronto, Canada.

University of Toronto, Li Ka Shing Knowledge Institute, St Michael's Hospital; Toronto, Canada.

Publication Information

J Neural Eng. 2020 Oct 13;17(5):056008. doi: 10.1088/1741-2552/abb7a7.

DOI: 10.1088/1741-2552/abb7a7
PMID: 32916675
Abstract

OBJECTIVE

Most deep neural networks (DNNs) used as brain-computer interface (BCI) classifiers are rarely viable for more than one person and are relatively shallow compared to the state of the art in the wider machine learning literature. The goal of this work is to frame these as a unified challenge and reconsider how transfer learning is used to overcome these difficulties.

APPROACH

We present two variations of a holistic approach to transfer learning with DNNs for BCI that rely on a deeper network called TIDNet. Our approaches use multiple subjects for training in the interest of creating a more universal classifier that is applicable for new (unseen) subjects. The first approach is purely subject-invariant and the second targets specific subjects, without loss of generality. We use five publicly accessible datasets covering a range of tasks and compare our approaches to state-of-the-art alternatives in detail.
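The "new (unseen) subjects" evaluation described above is commonly implemented as a leave-one-subject-out split: train on every subject's trials except one, then test on the held-out person. The paper does not publish such a helper; the sketch below is a minimal illustration of the idea with hypothetical names, assuming per-subject trial arrays keyed by subject ID:

```python
import numpy as np

def leave_one_subject_out(trials_by_subject, test_subject):
    """Split per-subject trial arrays into a pooled multi-subject
    training set and one held-out (unseen) test subject."""
    # Pool every subject's trials except the held-out one.
    train = np.concatenate(
        [x for s, x in trials_by_subject.items() if s != test_subject]
    )
    # The test subject's trials are never seen during training.
    test = trials_by_subject[test_subject]
    return train, test
```

Repeating this split once per subject yields the subject-invariant evaluation: each person in the dataset takes a turn as the unseen target.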

MAIN RESULTS

We observe that TIDNet in conjunction with our training augmentations is more consistent when compared to shallower baselines, and in some cases exhibits large and significant improvements, for instance motor imagery classification improvements of over 8%. Furthermore, we show that our suggested multi-domain learning (MDL) strategy strongly outperforms simply fine-tuned general models when targeting specific subjects, while remaining more generalizable to still unseen subjects.

SIGNIFICANCE

TIDNet in combination with a data alignment-based training augmentation proves to be a consistent classification approach of single raw trials and can be trained even with the inclusion of corrupted trials. Our MDL strategy calls into question the intuition to fine-tune trained classifiers to new subjects, as it proves simpler and more accurate while remaining general. Furthermore, we show evidence that augmented TIDNet training makes better use of additional subjects, showing continued and greater performance improvement over shallower alternatives, indicating promise for a new subject-invariant paradigm rather than a subject-specific one.
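The abstract does not spell out the "data alignment-based training augmentation." One alignment widely used for cross-subject EEG transfer is Euclidean alignment, which whitens each subject's trials by the inverse square root of their mean spatial covariance; the sketch below illustrates that idea under that assumption, and is not necessarily the paper's exact method:

```python
import numpy as np

def euclidean_align(trials):
    """Align one subject's EEG trials, shaped (n_trials, channels, samples),
    by whitening with the inverse square root of the mean spatial covariance."""
    # Per-trial spatial covariance, averaged over the subject's trials.
    covs = np.stack([t @ t.T / t.shape[1] for t in trials])
    r_mean = covs.mean(axis=0)
    # Inverse matrix square root via eigendecomposition (r_mean is SPD).
    vals, vecs = np.linalg.eigh(r_mean)
    r_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    # Project every trial into the aligned space.
    return np.array([r_inv_sqrt @ t for t in trials])
```

After this transform each subject's mean spatial covariance is the identity matrix, so trials from different people land in a more comparable space before being fed to a shared classifier.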


Similar Articles

1. Thinker invariance: enabling deep neural networks for BCI across more people.
J Neural Eng. 2020 Oct 13;17(5):056008. doi: 10.1088/1741-2552/abb7a7.

2. Transfer learning of an ensemble of DNNs for SSVEP BCI spellers without user-specific training.
J Neural Eng. 2023 Jan 18;20(1). doi: 10.1088/1741-2552/acacca.

3. Multi-subject classification of Motor Imagery EEG signals using transfer learning in neural networks.
Annu Int Conf IEEE Eng Med Biol Soc. 2021 Nov;2021:1006-1009. doi: 10.1109/EMBC46164.2021.9630155.

4. Adaptive transfer learning for EEG motor imagery classification with deep Convolutional Neural Network.
Neural Netw. 2021 Apr;136:1-10. doi: 10.1016/j.neunet.2020.12.013. Epub 2020 Dec 23.

5. Dual model transfer learning to compensate for individual variability in brain-computer interface.
Comput Methods Programs Biomed. 2024 Sep;254:108294. doi: 10.1016/j.cmpb.2024.108294. Epub 2024 Jun 17.

6. Instance Transfer Subject-Dependent Strategy for Motor Imagery Signal Classification Using Deep Convolutional Neural Networks.
Comput Math Methods Med. 2020 Aug 28;2020:1683013. doi: 10.1155/2020/1683013. eCollection 2020.

7. Validating Deep Neural Networks for Online Decoding of Motor Imagery Movements from EEG Signals.
Sensors (Basel). 2019 Jan 8;19(1):210. doi: 10.3390/s19010210.

8. Inter-subject transfer learning with an end-to-end deep convolutional neural network for EEG-based BCI.
J Neural Eng. 2019 Apr;16(2):026007. doi: 10.1088/1741-2552/aaf3f6. Epub 2018 Nov 26.

9. EEG-Inception: A Novel Deep Convolutional Neural Network for Assistive ERP-Based Brain-Computer Interfaces.
IEEE Trans Neural Syst Rehabil Eng. 2020 Dec;28(12):2773-2782. doi: 10.1109/TNSRE.2020.3048106. Epub 2021 Jan 28.

10. Dual stream neural networks for brain signal classification.
J Neural Eng. 2021 Jan 25;18(1). doi: 10.1088/1741-2552/abc903.

Cited By

1. SS-EMERGE - self-supervised enhancement for multidimension emotion recognition using GNNs for EEG.
Sci Rep. 2025 Apr 24;15(1):14254. doi: 10.1038/s41598-025-98623-7.

2. Flexible Patched Brain Transformer model for EEG decoding.
Sci Rep. 2025 Mar 29;15(1):10935. doi: 10.1038/s41598-025-86294-3.

3. How different immersive environments affect intracortical brain computer interfaces.
J Neural Eng. 2025 Feb 10;22(1):016032. doi: 10.1088/1741-2552/adb078.

4. Group-level brain decoding with deep learning.
Hum Brain Mapp. 2023 Dec 1;44(17):6105-6119. doi: 10.1002/hbm.26500. Epub 2023 Sep 27.

5. Convolutional Neural Network with a Topographic Representation Module for EEG-Based Brain-Computer Interfaces.
Brain Sci. 2023 Feb 5;13(2):268. doi: 10.3390/brainsci13020268.

6. Subject Separation Network for Reducing Calibration Time of MI-Based BCI.
Brain Sci. 2023 Jan 28;13(2):221. doi: 10.3390/brainsci13020221.

7. 2020 International brain-computer interface competition: A review.
Front Hum Neurosci. 2022 Jul 22;16:898300. doi: 10.3389/fnhum.2022.898300. eCollection 2022.

8. [Parameter transfer learning based on shallow visual geometry group network and its application in motor imagery classification].
Sheng Wu Yi Xue Gong Cheng Xue Za Zhi. 2022 Feb 25;39(1):28-38. doi: 10.7507/1001-5515.202108060.

9. BENDR: Using Transformers and a Contrastive Self-Supervised Learning Task to Learn From Massive Amounts of EEG Data.
Front Hum Neurosci. 2021 Jun 23;15:653659. doi: 10.3389/fnhum.2021.653659. eCollection 2021.

10. A Survey on Deep Learning-Based Short/Zero-Calibration Approaches for EEG-Based Brain-Computer Interfaces.
Front Hum Neurosci. 2021 May 28;15:643386. doi: 10.3389/fnhum.2021.643386. eCollection 2021.