Computational Brain Science Lab, KTH Royal Institute of Technology, Stockholm, Sweden.
Mathematics Department, Stockholm University, Stockholm, Sweden.
PLoS One. 2019 Aug 1;14(8):e0220161. doi: 10.1371/journal.pone.0220161. eCollection 2019.
From memorizing a musical tune to navigating a well-known route, many of our underlying behaviors have a strong temporal component. While the mechanisms behind the sequential nature of the underlying brain activity are likely multifarious and multi-scale, in this work we attempt to characterize to what degree some of these properties can be explained as a consequence of simple associative learning. To this end, we employ a parsimonious firing-rate attractor network equipped with the Hebbian-like Bayesian Confidence Propagating Neural Network (BCPNN) learning rule, which relies on synaptic traces with asymmetric temporal characteristics. The proposed network model is able to encode and reproduce temporal aspects of the input, and offers internal control of the recall dynamics by gain modulation. We provide an analytical characterization of the relationship between the structure of the weight matrix, the dynamical network parameters, and the temporal aspects of sequence recall. We also present a computational study of the performance of the system under the effects of noise for an extensive region of the parameter space. Finally, we show how the inclusion of modularity in our network structure facilitates the learning and recall of multiple overlapping sequences even in a noisy regime.
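To make the mechanism summarized above more concrete, the following minimal Python sketch illustrates how asymmetric synaptic traces can yield an asymmetric BCPNN-style weight matrix that drives forward sequence recall in a gain-modulated rate network. All parameter values, the one-unit-per-pattern simplification, the softmax normalization, and the adaptation variable are illustrative assumptions for this sketch, not the paper's exact implementation.

```python
import numpy as np

# Sketch: sequence learning with a BCPNN-like rule using asymmetric
# exponential traces, followed by gain-modulated rate-based recall.
# Assumed parameters; not the values used in the paper.

N = 5                 # one unit per pattern (simplification)
dt = 1.0              # integration step (ms)
tau_z_pre = 150.0     # presynaptic z-trace (slow): leaves a "tail" into the next pattern
tau_z_post = 5.0      # postsynaptic z-trace (fast)
tau_p = 5000.0        # probability-trace time constant
eps = 1e-4            # keeps probability estimates strictly positive

# --- Training: clamp the one-hot sequence 0 -> 1 -> ... -> N-1 ---
z_pre = np.full(N, eps); z_post = np.full(N, eps)
p_pre = np.full(N, eps); p_post = np.full(N, eps)
p_joint = np.full((N, N), eps**2)

pattern_duration = 100
for epoch in range(3):
    for pat in range(N):
        s = np.full(N, eps); s[pat] = 1.0
        for _ in range(pattern_duration):
            # asymmetric synaptic traces
            z_pre += dt * (s - z_pre) / tau_z_pre
            z_post += dt * (s - z_post) / tau_z_post
            # running probability estimates
            p_pre += dt * (z_pre - p_pre) / tau_p
            p_post += dt * (z_post - p_post) / tau_p
            p_joint += dt * (np.outer(z_pre, z_post) - p_joint) / tau_p

# Log-odds weights and biases; the trace asymmetry makes W asymmetric,
# so pattern k supports pattern k+1 more than the reverse.
W = np.log(p_joint / np.outer(p_pre, p_post))
bias = np.log(p_post)

# --- Recall: rate dynamics with softmax (soft winner-take-all) and gain G ---
tau_m = 20.0    # support/rate time constant
tau_a = 250.0   # adaptation time constant (lets activity move to the next element)
G = 2.0         # gain; modulating G changes recall speed and stability

def softmax(x, gain):
    e = np.exp(gain * (x - x.max()))
    return e / e.sum()

cue = np.full(N, eps); cue[0] = 1.0       # cue the first sequence element
support = W @ cue + bias
adapt = np.zeros(N)

winners = []
for t in range(4000):
    rate = softmax(support - adapt, G)
    support += dt * (W @ rate + bias - support) / tau_m
    adapt += dt * (rate - adapt) / tau_a
    w = int(np.argmax(rate))
    if not winners or winners[-1] != w:
        winners.append(w)

print("recalled order:", winners)
```

In this sketch the slow presynaptic trace overlaps the fast postsynaptic trace of the following pattern, so the joint probability estimate, and hence the weight, is larger in the forward direction; during recall, adaptation pushes activity off the current attractor and the asymmetric weights select the next element, while the gain G sets how sharply activity is concentrated.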