

Signal transcoding by nonlinear sensory neurons: information-entropy maximization, optimal transfer function, and anti-Hebbian adaptation.

Author information

Chapeau-Blondeau F, Raguin F

Affiliation

Faculté des Sciences, Université d'Angers, France.

Publication information

IMA J Math Appl Med Biol. 1997 Sep;14(3):227-39.

PMID: 9306676
Abstract

A principle of information-entropy maximization is introduced in order to characterize the optimal representation of an arbitrarily varying quantity by a neural output confined to a finite interval. We then study the conditions under which a neuron can effectively fulfil the requirements imposed by this information-theoretic optimal principle. We show that this can be achieved with the natural properties available to the neuron. Specifically, we first deduce that neural (monotonically increasing and saturating) nonlinearities are potentially efficient for achieving the entropy maximization, for any given input signal. Secondly, we derive simple laws which adaptively adjust modifiable parameters of a neuron toward maximum entropy. Remarkably, the adaptation laws that realize entropy maximization are found to belong to the class of anti-Hebbian laws (a class having experimental groundings), with a special, yet simple, nonlinear form. The present results highlight the usefulness of general information-theoretic principles in contributing to the understanding of neural systems and their remarkable performances for information processing.
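The entropy-maximization idea can be sketched numerically. The snippet below is an illustration, not the paper's exact derived law: it uses a generic logistic transfer function and the standard infomax gradient for a single unit (ascent on E[log |dy/dx|], in the style later popularized by Bell and Sejnowski); all parameter values are made up for the demo. Adapting the gain and bias drives the bounded output toward the uniform density, i.e. maximum entropy, and the x(1 - 2y) factor in the gain update has the anti-Hebbian character the abstract describes, weakening the gain whenever input and centred output are positively correlated.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=20_000)   # arbitrary varying input

def sigmoid(u):
    # Monotonically increasing, saturating nonlinearity mapping R -> (0, 1).
    return 1.0 / (1.0 + np.exp(-u))

def output_entropy(y, bins=50):
    # Discretized entropy (nats) of the bounded output; the uniform density
    # on (0, 1) attains the maximum, log(bins) ~ 3.91.
    counts, _ = np.histogram(y, bins=bins, range=(0.0, 1.0))
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

# Gradient ascent on E[log |dy/dx|], which for an invertible map equals the
# output entropy up to the (fixed) input entropy.  For y = sigmoid(w*x + b):
#   dw = 1/w + <x * (1 - 2y)>,    db = <1 - 2y>.
w, b = 5.0, -6.0                      # deliberately mismatched start
h_before = output_entropy(sigmoid(w * x + b))
lr = 0.05
for _ in range(2000):
    y = sigmoid(w * x + b)
    e = 1.0 - 2.0 * y                 # nonlinear "error" signal
    w += lr * (1.0 / w + np.mean(x * e))   # anti-Hebbian-flavoured term
    b += lr * np.mean(e)
h_after = output_entropy(sigmoid(w * x + b))
print(f"output entropy before: {h_before:.2f} nats, after: {h_after:.2f} nats")
```

After adaptation the histogram of y is close to flat, so the empirical entropy approaches its log(bins) ceiling; the badly saturating starting parameters waste most of the finite output interval and score far lower.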

