

Aligning Semantic in Brain and Language: A Curriculum Contrastive Method for Electroencephalography-to-Text Generation.

Publication Information

IEEE Trans Neural Syst Rehabil Eng. 2023;31:3874-3883. doi: 10.1109/TNSRE.2023.3314642. Epub 2023 Oct 9.

Abstract

Electroencephalography-to-Text generation (EEG-to-Text), which aims to directly generate natural text from EEG signals, has drawn increasing attention in recent years due to its enormous potential for brain-computer interfaces. However, the remarkable discrepancy between the subject-dependent EEG representation and the semantic-dependent text representation poses a great challenge to this task. To mitigate this, we devise a Curriculum Semantic-aware Contrastive Learning strategy (C-SCL), which effectively recalibrates the subject-dependent EEG representation to the semantic-dependent EEG representation, thereby reducing the discrepancy. Specifically, our C-SCL pulls semantically similar EEG representations together while pushing apart dissimilar ones. In addition, to introduce more meaningful contrastive pairs, we carefully employ curriculum learning, which not only crafts meaningful pairs but also makes the learning proceed progressively. We conduct extensive experiments on the ZuCo benchmark, and our method, combined with diverse models and architectures, shows stable improvements across three types of metrics while achieving a new state of the art. Further investigation demonstrates not only its superiority in both the single-subject and low-resource settings but also its robust generalizability in the zero-shot setting. Our code is available at: https://github.com/xcfcode/contrastive_eeg2text.
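The mechanism described above, pulling semantically similar EEG representations together while pushing dissimilar ones apart, is an instance of supervised contrastive learning. The snippet below is a minimal, hypothetical sketch of such a loss in PyTorch; it assumes EEG segments have already been encoded into fixed-size vectors and grouped by semantic similarity, and the names (`semantic_contrastive_loss`, `eeg_embeds`, `semantic_labels`) are illustrative rather than taken from the authors' released code, which additionally schedules contrastive pairs with a curriculum that is omitted here.

```python
# Minimal sketch of a semantic-aware contrastive (InfoNCE-style) loss.
# All names are hypothetical; see the repository linked in the abstract
# for the paper's actual implementation.
import torch
import torch.nn.functional as F


def semantic_contrastive_loss(eeg_embeds: torch.Tensor,
                              semantic_labels: torch.Tensor,
                              temperature: float = 0.1) -> torch.Tensor:
    """eeg_embeds: (N, D) EEG feature vectors.
    semantic_labels: (N,) group ids; samples sharing an id are treated as
    semantically similar positives, all other samples as negatives."""
    z = F.normalize(eeg_embeds, dim=1)            # work in cosine-similarity space
    sim = z @ z.t() / temperature                 # (N, N) pairwise logits
    n = sim.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(eye, -1e9)              # exclude self-pairs

    pos = (semantic_labels.unsqueeze(0) == semantic_labels.unsqueeze(1)) & ~eye
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)

    # Average log-likelihood of positives per anchor; anchors without any
    # positive pair in the batch are skipped.
    pos_counts = pos.sum(dim=1)
    valid = pos_counts > 0
    mean_log_prob_pos = (log_prob * pos.float()).sum(dim=1)[valid] / pos_counts[valid]
    return -mean_log_prob_pos.mean()


if __name__ == "__main__":
    # Toy usage: four EEG embeddings belonging to two semantic groups.
    embeds = torch.randn(4, 128)
    labels = torch.tensor([0, 0, 1, 1])
    print(semantic_contrastive_loss(embeds, labels))
```

Minimizing this loss drives embeddings with the same semantic label toward one another and away from the rest of the batch; a curriculum version would feed the loss progressively harder positive/negative pairs as training advances.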

