Center for Adaptive Systems, Department of Cognitive and Neural Systems, Boston University, 677 Beacon Street, Boston, MA, 02215, USA.
Neural Netw. 2012 Jan;25(1):21-9. doi: 10.1016/j.neunet.2011.07.009. Epub 2011 Aug 12.
The activities of neurons vary within small intervals that are bounded both above and below, yet the inputs to these neurons may vary many-fold. How do networks of neurons process distributed input patterns effectively under these conditions? If a large number of input sources intermittently converge on a cell through time, then a serious design problem arises: if cell activities are sensitive to large inputs, then why do small inputs not get lost in internal system noise? If cell activities are sensitive to small inputs, then why do they not all saturate at their maximum values in response to large inputs, thereby becoming incapable of processing analog differences in inputs across an entire network? Grossberg (1973) solved this noise-saturation dilemma using neurons that obey the membrane, or shunting, equations of neurophysiology interacting in recurrent and non-recurrent on-center off-surround networks, and showed how different signal functions influence the activity patterns that the network stores in short-term memory. These results demonstrated that maintaining a balance between excitation and inhibition in a neural network is essential for processing distributed patterns of inputs and signals without experiencing the catastrophes of noise or saturation. However, shunting on-center off-surround networks guarantee only that cell activities remain sensitive to the relative sizes of inputs and recurrent signals, not that they will use the full dynamic range that each cell can support. Additional homeostatic plasticity mechanisms are needed to anchor the activities of networks to exploit their full dynamic range. This article shows how mechanisms of synaptic scaling can be incorporated within recurrent on-center off-surround networks in such a way that their pattern processing capabilities, including the ability to make winner-take-all decisions, are preserved.
This model generalizes the synaptic scaling model of van Rossum, Bi, & Turrigiano (2000) for a single cell to a pattern-processing network of shunting cells that is capable of short-term memory storage, including a representation of how BDNF may homeostatically scale the strengths of excitatory and inhibitory synapses in opposite directions.
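How a shunting on-center off-surround network escapes the noise-saturation dilemma can be seen directly from its steady state. A minimal sketch, assuming the classical non-recurrent feedforward form dx_i/dt = -A*x_i + (B - x_i)*I_i - x_i*sum_{k != i} I_k (the function name and parameter values here are illustrative, not taken from the article):

```python
import numpy as np

def shunting_steady_state(inputs, A=1.0, B=1.0):
    """Equilibrium activities of a non-recurrent shunting on-center
    off-surround network: setting dx_i/dt = 0 in
        dx_i/dt = -A*x_i + (B - x_i)*I_i - x_i * sum_{k != i} I_k
    gives x_i = B * I_i / (A + sum_k I_k)."""
    I = np.asarray(inputs, dtype=float)
    return B * I / (A + I.sum())

# The same relative input pattern at two very different overall intensities:
small = shunting_steady_state([1.0, 2.0, 3.0])
large = shunting_steady_state([100.0, 200.0, 300.0])
```

Every activity stays strictly below the upper bound B no matter how large the total input grows, while the ratios x_i / sum_k x_k equal I_i / sum_k I_k at both intensities: the network normalizes total activity and preserves the relative input pattern instead of saturating.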
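The direction of the homeostatic effect described above can be sketched with a toy multiplicative scaling rule, in the spirit of van Rossum, Bi, and Turrigiano (2000): when a cell's time-averaged activity falls below a target, excitatory synapses are scaled up and inhibitory synapses are scaled down, and vice versa. This is only an illustration of the opposite-direction scaling; the function, its parameters, and the linear update are assumptions, not the article's BDNF model:

```python
def scale_weights(w_exc, w_inh, activity, target, rate=0.1):
    """One homeostatic scaling step (illustrative, not the paper's model).
    Excitatory and inhibitory weights are multiplicatively scaled in
    opposite directions by the activity error (target - activity)."""
    err = target - activity
    w_exc_new = w_exc * (1.0 + rate * err)   # below target -> strengthen excitation
    w_inh_new = w_inh * (1.0 - rate * err)   # below target -> weaken inhibition
    return w_exc_new, w_inh_new

# Cell firing below its target: excitation scaled up, inhibition scaled down.
w_exc_lo, w_inh_lo = scale_weights(1.0, 1.0, activity=0.2, target=0.5)
# Cell firing above its target: the opposite adjustment.
w_exc_hi, w_inh_hi = scale_weights(1.0, 1.0, activity=0.8, target=0.5)
```

Multiplicative (rather than additive) scaling preserves the relative strengths of a cell's synapses while shifting its overall operating point, which is what lets such a mechanism anchor activities into the full dynamic range without destroying the stored pattern.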