Williams Alex H, Degleris Anthony, Wang Yixin, Linderman Scott W
Department of Statistics, Stanford University, Stanford, CA 94305.
Department of Electrical Engineering, Stanford University, Stanford, CA 94305.
Adv Neural Inf Process Syst. 2020 Dec;33:14350-14361.
Sparse sequences of neural spikes are posited to underlie aspects of working memory [1], motor production [2], and learning [3, 4]. Discovering these sequences in an unsupervised manner is a longstanding problem in statistical neuroscience [5-7]. Promising recent work [4, 8] utilized a convolutive nonnegative matrix factorization model [9] to tackle this challenge. However, this model requires spike times to be discretized, utilizes a sub-optimal least-squares criterion, and does not provide uncertainty estimates for model predictions or estimated parameters. We address each of these shortcomings by developing a point process model that characterizes fine-scale sequences at the level of individual spikes and represents sequence occurrences as a small number of marked events in continuous time. This ultra-sparse representation of sequence events opens new possibilities for spike train modeling. For example, we introduce learnable time warping parameters to model sequences of varying duration, which have been experimentally observed in neural circuits [10]. We demonstrate these advantages on experimental recordings from songbird higher vocal center and rodent hippocampus.
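To make the generative idea concrete, below is a minimal simulation sketch, not the paper's implementation, of the kind of marked point process the abstract describes: sparse latent sequence events occur in continuous time, each carries a time-warp factor, and each event induces spikes in every neuron at a neuron-specific delay stretched by that warp. All names and values here (rate_events, delays, the warp range, the participation probability) are illustrative assumptions rather than the authors' parameters.

# Hedged sketch of a sequence-generating marked point process (assumptions noted above).
import numpy as np

rng = np.random.default_rng(0)

T = 60.0           # recording length in seconds
n_neurons = 50     # number of recorded neurons
rate_events = 0.1  # expected latent sequence events per second (sparse)
jitter = 0.01      # spike-time jitter around each neuron's delay (s)

# Neuron-specific delays define a stereotyped sequence (assumed to span 0-0.5 s).
delays = np.sort(rng.uniform(0.0, 0.5, size=n_neurons))

# Latent events: a homogeneous Poisson number of event times, each "marked" with
# a time-warp factor that stretches or compresses the whole sequence.
n_events = rng.poisson(rate_events * T)
event_times = np.sort(rng.uniform(0.0, T, size=n_events))
warps = rng.uniform(0.8, 1.25, size=n_events)  # assumed warp range

# Each event triggers spikes at event_time + warp * delay plus jitter;
# background spikes are omitted for brevity.
spikes = []  # list of (time, neuron) pairs
for t0, w in zip(event_times, warps):
    emitted = rng.random(n_neurons) < 0.8  # assumed per-neuron participation probability
    for n in np.flatnonzero(emitted):
        spikes.append((t0 + w * delays[n] + rng.normal(0.0, jitter), n))

spikes.sort()
print(f"{n_events} latent events -> {len(spikes)} spikes")

The point of the sketch is only to show why the representation is "ultra-sparse": a handful of continuous-time events, each with a small set of marks, accounts for many observed spikes.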