Mascaro M, Amit D J
Dipartimento di Fisica, Università di Roma La Sapienza, Italy.
Network. 1999 Nov;10(4):351-73.
Collective behaviour of neural networks often divides the ensemble of neurons into sub-classes by neuron type, by selective synaptic potentiation, or by mode of stimulation. When the number of classes becomes larger than two, the analysis, even in a mean-field theory, loses its intuitive appeal because of the dimensionality of the space of dynamical variables. Often one is interested in the behaviour of a reduced set of sub-populations (in focus) and in their dependence on the system's parameters, as in searching for coexistence of spontaneous activity and working memory; in the competition between different working memories; in the competition between working memory and a new stimulus; or in the interaction between selective activity in two different neural modules. For such cases we present a method for reducing the dimensionality of the system to one or two dimensions, even when the total number of populations involved is higher. In the reduced system the familiar intuitive tools apply, and the analysis of the dependence of different network states on ambient parameters becomes transparent. Moreover, when the coding of states in focus is sparse, the computational complexity is much reduced. Beyond the analysis, we present a set of detailed examples. We conclude with a discussion of questions of stability in the reduced system.
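The kind of reduction the abstract describes can be illustrated with a minimal sketch: a mean-field rate model with several interacting populations, where all populations except the two "in focus" are clamped at their fixed-point rates, leaving an effective two-dimensional flow. The transfer function, connectivity matrix, and all parameters below are hypothetical stand-ins for illustration only, not taken from the paper.

```python
import numpy as np

def phi(x):
    """Hypothetical sigmoidal transfer function (rate vs. input)."""
    return 1.0 / (1.0 + np.exp(-x))

def full_dynamics(r, J, h, tau=1.0):
    """dr/dt for the full N-population mean-field rate system."""
    return (-r + phi(J @ r + h)) / tau

def reduced_dynamics(r_focus, focus, clamped, J, h, tau=1.0):
    """Two-dimensional flow for the focus populations, with the
    remaining populations frozen at the supplied clamped rates."""
    r = clamped.copy()
    r[focus] = r_focus
    return full_dynamics(r, J, h, tau)[focus]

# Toy 4-population network (illustrative numbers): two selective
# populations in focus, plus background excitatory/inhibitory pools.
J = np.array([[ 2.0, -0.5, 0.5, -1.0],
              [-0.5,  2.0, 0.5, -1.0],
              [ 0.1,  0.1, 0.5, -0.5],
              [ 0.5,  0.5, 0.5,  0.0]])
h = np.array([0.2, 0.2, 0.1, 0.1])

# Relax the full system to a fixed point (simple Euler integration).
r = np.full(4, 0.5)
for _ in range(5000):
    r += 0.01 * full_dynamics(r, J, h)

# At that fixed point, the reduced 2D flow for the focus populations
# vanishes as well, by construction.
focus = [0, 1]
flow = reduced_dynamics(r[focus], focus, r, J, h)
```

In the reduced picture, `flow` is a vector field on a plane, so the usual intuitive tools (nullclines, flow diagrams) apply even though the underlying network has more populations.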