Chapeau-Blondeau F, Raguin F
Faculté des Sciences, Université d'Angers, France.
IMA J Math Appl Med Biol. 1997 Sep;14(3):227-39.
A principle of information-entropy maximization is introduced to characterize the optimal representation of an arbitrarily varying quantity by a neural output confined to a finite interval. We then study the conditions under which a neuron can effectively fulfil the requirements imposed by this information-theoretic optimality principle, and show that this can be achieved with the natural properties available to the neuron. Specifically, we first deduce that neural nonlinearities (monotonically increasing and saturating) are, in principle, capable of achieving entropy maximization for any given input signal. Second, we derive simple laws that adaptively adjust the modifiable parameters of a neuron toward maximum entropy. Remarkably, the adaptation laws realizing entropy maximization are found to belong to the class of anti-Hebbian laws (a class with experimental grounding), with a special yet simple nonlinear form. These results highlight the usefulness of general information-theoretic principles in furthering the understanding of neural systems and their remarkable performance in information processing.
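The following is a minimal sketch of the kind of scheme the abstract describes: a saturating neural nonlinearity whose modifiable parameters are adapted, sample by sample, so as to drive the output toward maximum entropy on its finite interval. The specific choices here are assumptions made for illustration and are not taken from the paper: the neuron is modeled as a logistic sigmoid y = 1/(1 + exp(-(w*x + b))), and the updates are the standard stochastic-gradient infomax rules (as in Bell and Sejnowski, 1995), which share the simple nonlinear, anti-Hebbian-like (1 - 2y) structure the abstract alludes to.

```python
# Hypothetical sketch: entropy maximization by adapting a sigmoidal neuron.
# Assumptions (not from the paper): logistic nonlinearity and
# Bell-Sejnowski-style stochastic-gradient updates of gain w and bias b.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

def output_entropy(y, bins=50):
    """Histogram estimate of the differential entropy of y on [0, 1].

    The maximum, 0 nats, is reached by the uniform density on [0, 1].
    """
    p, edges = np.histogram(y, bins=bins, range=(0.0, 1.0), density=True)
    widths = np.diff(edges)
    mask = p > 0
    return -np.sum(p[mask] * np.log(p[mask]) * widths[mask])

# Arbitrary non-Gaussian input signal; the optimal sigmoid should come to
# approximate the input's cumulative distribution function.
x = rng.exponential(scale=2.0, size=100_000)

w, b = 1.0, 0.0        # modifiable neuron parameters (gain and bias)
eta = 0.01             # learning rate

for xi in x:
    y = sigmoid(w * xi + b)
    # Stochastic ascent on E[log |dy/dx|], i.e. on the output entropy.
    # The (1 - 2y) factor is the simple nonlinear, anti-Hebbian-like term.
    w += eta * (1.0 / w + xi * (1.0 - 2.0 * y))
    b += eta * (1.0 - 2.0 * y)

y_before = sigmoid(1.0 * x + 0.0)   # initial parameters
y_after = sigmoid(w * x + b)        # adapted parameters
print(f"output entropy before adaptation: {output_entropy(y_before):.3f} nats")
print(f"output entropy after  adaptation: {output_entropy(y_after):.3f} nats (max = 0)")
```

Under these assumptions the adapted output histogram flattens toward the uniform density on the output interval, which is the maximum-entropy representation the abstract's optimality principle calls for.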