Department of Statistics, The University of Chicago, Chicago, IL 60637, USA.
Department of Statistics, The University of Chicago, Chicago, IL 60637, USA; Department of Neurobiology, The University of Chicago, Chicago, IL 60637, USA; Department of Neurobiology, Duke University, Durham, NC 27710, USA; Department of Physics, Duke University, Durham, NC 27708, USA.
Neuron. 2018 Jul 11;99(1):227-238.e4. doi: 10.1016/j.neuron.2018.05.038. Epub 2018 Jun 14.
The attractor neural network scenario is a popular framework for memory storage in association cortex, but there is still a large gap between models based on this scenario and experimental data. We study a recurrent network model in which both the learning rules and the distribution of stored patterns are inferred from distributions of visual responses to novel and familiar images in the inferior temporal cortex (ITC). Unlike classical attractor neural network models, our model exhibits graded activity in retrieval states, with distributions of firing rates that are close to lognormal. The inferred learning rules are close to maximizing the number of stored patterns within a family of unsupervised Hebbian learning rules, suggesting that learning rules in ITC are optimized to store a large number of attractor states. Finally, we show that there exist two types of retrieval states: one in which firing rates are constant in time and another in which firing rates fluctuate chaotically.
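To make the described architecture concrete, below is a minimal, illustrative sketch in Python/NumPy of the ingredients named in the abstract: a rate-based recurrent network whose connectivity is built by a separable Hebbian rule from stored patterns with lognormally distributed firing rates, cued and simulated to check retrieval. The specific learning rules and transfer function in the paper are inferred from ITC data; the functional forms and parameter values here are placeholder assumptions, not the authors' fitted quantities.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Placeholder parameters (illustrative, not from the paper) ---
N, P = 1000, 10                 # neurons, stored patterns
tau, dt, T = 20.0, 1.0, 500.0   # time constant, step, duration (arbitrary units)

# Stored patterns: firing rates drawn from a lognormal distribution
patterns = rng.lognormal(mean=1.0, sigma=1.0, size=(P, N))

# Separable Hebbian rule J_ij ~ sum_mu f(r_i^mu) g(r_j^mu); here f = g = rate - mean,
# a crude stand-in for the pre/postsynaptic functions inferred from ITC data.
F = patterns - patterns.mean()
J = (F.T @ F) / N
np.fill_diagonal(J, 0.0)        # no self-coupling

def phi(x, r_max=50.0, gain=0.1):
    """Saturating rate transfer function (illustrative sigmoid)."""
    return r_max * 0.5 * (1.0 + np.tanh(gain * x))

# Retrieval test: cue the network with a noisy version of pattern 0
r = np.clip(patterns[0] * (1.0 + 0.5 * rng.standard_normal(N)), 0.0, None)
for _ in range(int(T / dt)):
    r += (dt / tau) * (-r + phi(J @ r - 2.0))   # -2.0: ad hoc inhibitory offset

# Correlation of the final rates with each stored pattern
overlaps = [np.corrcoef(r, p)[0, 1] for p in patterns]
print("correlation with stored patterns:", np.round(overlaps, 2))
```

In the paper, the transfer function and learning rule are constrained so that retrieval states reproduce the near-lognormal distribution of responses to familiar images; this sketch only illustrates the general retrieval mechanism of a Hebbian recurrent network.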