Xie Xiaohui, Hahnloser Richard H R, Seung H Sebastian
Department of Brain and Cognitive Sciences, Massachusetts Institute of Technology, Cambridge, MA 02139, USA.
Neural Comput. 2002 Nov;14(11):2627-46. doi: 10.1162/089976602760408008.
Winner-take-all networks have been proposed to underlie many of the brain's fundamental computational abilities. However, little is known about how to extend the grouping of potential winners in these networks beyond single neurons or uniformly arranged groups of neurons. We show that competition between arbitrary groups of neurons can be realized by organizing lateral inhibition in linear threshold networks. Given a collection of potentially overlapping groups (excluding some degenerate cases), the lateral inhibition results in network dynamics such that any permitted set of neurons that can be coactivated by some input at a stable steady state is contained in one of the groups. The information about the input is preserved in this operation: the activity level of a neuron in a permitted set corresponds to its stimulus strength, amplified by some constant. Sets of neurons that are not part of a group cannot be coactivated by any input at a stable steady state. We analyze the storage capacity of such a network for random groups: the number of random groups the network can store as permitted sets without creating too many spurious ones. In this framework, we calculate the optimal sparsity of the groups (maximizing group entropy). We find that for dense inputs, the optimal sparsity is unphysiologically small. However, when the inputs and the groups are equally sparse, we derive a more plausible optimal sparsity. We believe our results are the first steps toward attractor theories in hybrid analog-digital networks.
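The mechanism described above can be illustrated with a minimal simulation. The sketch below is a hypothetical toy example, not the paper's construction: it uses three neurons, two overlapping groups, and a linear threshold network in which lateral inhibition connects only neurons that share no group (self-excitation is omitted for simplicity, so the amplification constant is 1). The specific group sets, inhibition strength `beta`, and integration parameters are assumptions chosen for illustration.

```python
import numpy as np

# Overlapping groups (hypothetical example): neuron 1 belongs to both.
groups = [{0, 1}, {1, 2}]
n = 3
beta = 2.0  # lateral inhibition between neurons sharing no group

# Build the weight matrix: inhibit pairs that never co-occur in a group.
W = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j and not any(i in g and j in g for g in groups):
            W[i, j] = -beta

def run(b, steps=2000, dt=0.01):
    """Euler-integrate the linear threshold dynamics
    dx/dt = -x + [W x + b]_+ from rest."""
    x = np.zeros(n)
    for _ in range(steps):
        x += dt * (-x + np.maximum(0.0, W @ x + b))
    return x

# Drive all three neurons; neurons 0 and 2 share no group, so they
# compete, and only a set contained in one group survives.
x = run(np.array([1.0, 1.0, 1.1]))
print(np.round(x, 3))  # active set {1, 2}, contained in group {1, 2}
```

At the stable steady state, neuron 0 is silenced while neurons 1 and 2 remain active at their input strengths, consistent with the claim that every coactive set is contained in one of the stored groups and that input information is preserved within the permitted set.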