
Reconstructing signal during brain stimulation with Stim-BERT: a self-supervised learning model trained on millions of iEEG files.

Author Information

Menon Karthik, Tcheng Thomas, Seale Cairn, Greene David, Morrell Martha, Desai Sharanya Arcot

Affiliations

NeuroPace Inc., Mountain View, CA, United States.

Department of Neurology, Stanford University, Palo Alto, CA, United States.

Publication Information

Front Artif Intell. 2025 Feb 18;8:1502504. doi: 10.3389/frai.2025.1502504. eCollection 2025.

Abstract

Brain stimulation has become a widely accepted treatment for neurological disorders such as epilepsy and Parkinson's disease. These devices not only deliver therapeutic stimulation but also record brain activity, offering valuable insights into neural dynamics. However, brain recordings during stimulation are often blanked or contaminated by artifact, posing significant challenges for analyzing the acute effects of stimulation. To address these challenges, we propose a transformer-based model, Stim-BERT, trained on a large intracranial EEG (iEEG) dataset to reconstruct brain activity lost during stimulation blanking. To train the Stim-BERT model, 4,653,720 iEEG channels from 380 RNS system patients were tokenized into 3 (or 4) frequency band bins using 1 s non-overlapping windows, resulting in a total vocabulary size of 1,000 (or 10,000). Stim-BERT leverages self-supervised learning with masked tokens, inspired by BERT's success in natural language processing, and shows significant improvements over traditional interpolation methods, especially for longer blanking periods. These findings highlight the potential of transformer models for filling in missing time-series neural data, advancing neural signal processing and our efforts to understand the acute effects of brain stimulation.
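The reported vocabulary sizes (1,000 for 3 bands, 10,000 for 4) are consistent with each band-power bin being quantized into 10 levels and the per-band levels combined as digits of a base-10 token id. The Python sketch below illustrates that reading of the tokenization, followed by the BERT-style masking objective. It is not the authors' code: the sampling rate, band edges, quantile-based quantization, Welch power estimate, and the reserved [MASK] id are all illustrative assumptions.

```python
# Minimal sketch of 1 s band-power tokenization and masked-token training
# targets, under assumed parameters (not the published Stim-BERT pipeline).
import numpy as np
from scipy.signal import welch

FS = 250                                  # assumed iEEG sampling rate (Hz)
BANDS = [(1, 8), (8, 30), (30, 100)]      # hypothetical band edges (Hz)
N_LEVELS = 10                             # 10 levels/band -> 10**3 = 1,000 tokens

def band_powers(window: np.ndarray) -> np.ndarray:
    """Mean Welch PSD of one 1 s window within each frequency band."""
    freqs, psd = welch(window, fs=FS, nperseg=len(window))
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS])

def tokenize(windows: np.ndarray, edges: np.ndarray) -> np.ndarray:
    """Map each 1 s window to one integer token in [0, N_LEVELS**len(BANDS))."""
    tokens = np.zeros(len(windows), dtype=np.int64)
    for i, w in enumerate(windows):
        # Quantize each band's power with per-band edges fit on the corpus,
        # then read the per-band levels as digits of a base-N_LEVELS number.
        for b, p in enumerate(band_powers(w)):
            tokens[i] = tokens[i] * N_LEVELS + int(np.searchsorted(edges[b], p))
    return tokens

# Fit quantization edges on (here, synthetic) training windows: the 9 interior
# deciles of each band's power distribution give 10 equally populated levels.
rng = np.random.default_rng(0)
train = rng.standard_normal((1000, FS))   # 1,000 fake 1 s windows
powers = np.stack([band_powers(w) for w in train])
edges = np.quantile(powers, np.linspace(0, 1, N_LEVELS + 1)[1:-1], axis=0).T

seq = tokenize(train[:60], edges)         # a 60 s channel -> 60 tokens

# BERT-style masked objective: hide a contiguous span of tokens, as
# stimulation blanking would, and train the model to reconstruct it.
MASK_ID = N_LEVELS ** len(BANDS)          # reserve one extra id for [MASK]
masked = seq.copy()
masked[25:35] = MASK_ID                   # a hypothetical 10 s blanked span
print(seq[25:35], masked[25:35])
```

One plausible reason to quantize at corpus quantiles rather than fixed thresholds is that it keeps the 10 levels per band roughly equally populated, so no single token dominates the vocabulary, similar in spirit to how subword vocabularies are balanced in NLP.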


Graphical abstract: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/926f/11876146/58408249cdcc/frai-08-1502504-g001.jpg
