Menon Karthik, Tcheng Thomas, Seale Cairn, Greene David, Morrell Martha, Desai Sharanya Arcot
NeuroPace Inc., Mountain View, CA, United States.
Department of Neurology, Stanford University, Palo Alto, CA, United States.
Front Artif Intell. 2025 Feb 18;8:1502504. doi: 10.3389/frai.2025.1502504. eCollection 2025.
Brain stimulation has become a widely accepted treatment for neurological disorders such as epilepsy and Parkinson's disease. These devices not only deliver therapeutic stimulation but also record brain activity, offering valuable insights into neural dynamics. However, brain recordings during stimulation are often blanked or contaminated by stimulation artifact, posing significant challenges for analyzing the acute effects of stimulation. To address these challenges, we propose a transformer-based model, Stim-BERT, trained on a large intracranial EEG (iEEG) dataset to reconstruct brain activity lost during stimulation blanking. To train the Stim-BERT model, 4,653,720 iEEG channels from 380 RNS System patients were tokenized using 1 s non-overlapping windows, with each window binned into 3 (or 4) frequency bands, resulting in a total vocabulary size of 1,000 (or 10,000). Stim-BERT leverages self-supervised learning with masked tokens, inspired by BERT's success in natural language processing, and shows significant improvements over traditional interpolation methods, especially for longer blanking periods. These findings highlight the potential of transformer models for filling in missing time-series neural data, advancing neural signal processing and our efforts to understand the acute effects of brain stimulation.
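A minimal sketch of the kind of windowed band-power tokenization the abstract describes is shown below. The sampling rate, band edges, quantization scheme (10 quantile levels per band), and all function names are assumptions for illustration and are not taken from the paper; with 3 assumed bands and 10 levels each, the token vocabulary has at most 10^3 = 1,000 entries, consistent with the stated vocabulary size.

```python
import numpy as np
from scipy.signal import welch

# Assumed constants -- not specified in the abstract.
FS = 250                              # assumed iEEG sampling rate (Hz)
BANDS = [(1, 8), (8, 20), (20, 70)]   # assumed 3 frequency bands (Hz)
N_LEVELS = 10                         # assumed bins per band -> 10**3 = 1,000 tokens

def band_powers(window, fs=FS, bands=BANDS):
    """Power in each frequency band for one 1 s window of a single channel."""
    freqs, psd = welch(window, fs=fs, nperseg=len(window))
    return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands])

def tokenize_channel(signal, fs=FS, n_levels=N_LEVELS):
    """Map a 1-D iEEG channel to integer tokens, one per 1 s non-overlapping window."""
    n_windows = len(signal) // fs
    powers = np.array([band_powers(signal[i * fs:(i + 1) * fs], fs)
                       for i in range(n_windows)])
    # Quantize each band's log-power into n_levels bins using per-band quantiles.
    log_p = np.log10(powers + 1e-12)
    edges = np.quantile(log_p, np.linspace(0, 1, n_levels + 1)[1:-1], axis=0)
    digits = np.stack([np.digitize(log_p[:, b], edges[:, b])
                       for b in range(log_p.shape[1])], axis=1)
    # Combine per-band digits into a single token id in [0, n_levels ** n_bands).
    return (digits * n_levels ** np.arange(digits.shape[1])).sum(axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_channel = rng.standard_normal(60 * FS)   # 60 s of synthetic data
    tokens = tokenize_channel(fake_channel)
    print(tokens[:10], "vocabulary upper bound:", N_LEVELS ** len(BANDS))
```

The resulting token sequences could then feed a BERT-style masked-token objective, where tokens in blanked (stimulation) periods are masked and the model is trained to predict them from surrounding context; this training setup is described only at a high level in the abstract.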