

Opportunities, pitfalls and trade-offs in designing protocols for measuring the neural correlates of speech.

Affiliations

Intelligent Systems Research Centre, Ulster University, Derry, UK.

Institute for Research in Social Sciences, Ulster University, Jordanstown, UK.

Publication Information

Neurosci Biobehav Rev. 2022 Sep;140:104783. doi: 10.1016/j.neubiorev.2022.104783. Epub 2022 Jul 27.

DOI: 10.1016/j.neubiorev.2022.104783
PMID: 35907491
Abstract

Decoding speech and speech-related processes directly from the human brain has intensified in studies over recent years as such a decoder has the potential to positively impact people with limited communication capacity due to disease or injury. Additionally, it can present entirely new forms of human-computer interaction and human-machine communication in general and facilitate better neuroscientific understanding of speech processes. Here, we synthesize the literature on neural speech decoding pertaining to how speech decoding experiments have been conducted, coalescing around a necessity for thoughtful experimental design aimed at specific research goals, and robust procedures for evaluating speech decoding paradigms. We examine the use of different modalities for presenting stimuli to participants, methods for construction of paradigms including timings and speech rhythms, and possible linguistic considerations. In addition, novel methods for eliciting naturalistic speech and validating imagined speech task performance in experimental settings are presented based on recent research. We also describe the multitude of terms used to instruct participants on how to produce imagined speech during experiments and propose methods for investigating the effect of these terms on imagined speech decoding. We demonstrate that the range of experimental procedures used in neural speech decoding studies can have unintended consequences which can impact upon the efficacy of the knowledge obtained. The review delineates the strengths and weaknesses of present approaches and poses methodological advances which we anticipate will enhance experimental design, and progress toward the optimal design of movement independent direct speech brain-computer interfaces.


Similar Articles

1. Opportunities, pitfalls and trade-offs in designing protocols for measuring the neural correlates of speech.
   Neurosci Biobehav Rev. 2022 Sep;140:104783. doi: 10.1016/j.neubiorev.2022.104783. Epub 2022 Jul 27.
2. A Bimodal Deep Learning Architecture for EEG-fNIRS Decoding of Overt and Imagined Speech.
   IEEE Trans Biomed Eng. 2022 Jun;69(6):1983-1994. doi: 10.1109/TBME.2021.3132861. Epub 2022 May 19.
3. Evaluation of Hyperparameter Optimization in Machine and Deep Learning Methods for Decoding Imagined Speech EEG.
   Sensors (Basel). 2020 Aug 17;20(16):4629. doi: 10.3390/s20164629.
4. Decoding imagined speech from EEG signals using hybrid-scale spatial-temporal dilated convolution network.
   J Neural Eng. 2021 Aug 11;18(4). doi: 10.1088/1741-2552/ac13c0.
5. Neural Decoding of Imagined Speech and Visual Imagery as Intuitive Paradigms for BCI Communication.
   IEEE Trans Neural Syst Rehabil Eng. 2020 Dec;28(12):2647-2659. doi: 10.1109/TNSRE.2020.3040289. Epub 2021 Jan 28.
6. Imagined speech can be decoded from low- and cross-frequency intracranial EEG features.
   Nat Commun. 2022 Jan 10;13(1):48. doi: 10.1038/s41467-021-27725-3.
7. The nested hierarchy of overt, mouthed, and imagined speech activity evident in intracranial recordings.
   Neuroimage. 2023 Apr 1;269:119913. doi: 10.1016/j.neuroimage.2023.119913. Epub 2023 Jan 31.
8. Recording human electrocorticographic (ECoG) signals for neuroscientific research and real-time functional cortical mapping.
   J Vis Exp. 2012 Jun 26;(64):3993. doi: 10.3791/3993.
9. Brain-to-speech decoding will require linguistic and pragmatic data.
   J Neural Eng. 2018 Dec;15(6):063001. doi: 10.1088/1741-2552/aae466. Epub 2018 Sep 26.
10. Neurolinguistics Research Advancing Development of a Direct-Speech Brain-Computer Interface.
    iScience. 2018 Oct 26;8:103-125. doi: 10.1016/j.isci.2018.09.016. Epub 2018 Sep 22.

Cited By

1. Improved evaluation of waveform reconstruction in speech decoding based on invasive brain-computer interfaces.
   Imaging Neurosci (Camb). 2025 Sep 10;3. doi: 10.1162/IMAG.a.146. eCollection 2025.
2. Learning to operate an imagined speech Brain-Computer Interface involves the spatial and frequency tuning of neural activity.
   Commun Biol. 2025 Feb 20;8(1):271. doi: 10.1038/s42003-025-07464-7.
3. Decoding imagined speech with delay differential analysis.
   Front Hum Neurosci. 2024 May 17;18:1398065. doi: 10.3389/fnhum.2024.1398065. eCollection 2024.
4. Representation of internal speech by single neurons in human supramarginal gyrus.
   Nat Hum Behav. 2024 Jun;8(6):1136-1149. doi: 10.1038/s41562-024-01867-y. Epub 2024 May 13.