A spiking neuron as information bottleneck.

Affiliation

Institute for Theoretical Computer Science, Graz University of Technology, Graz, Austria.

Publication information

Neural Comput. 2010 Aug;22(8):1961-92. doi: 10.1162/neco.2010.08-09-1084.

Abstract

Neurons receive thousands of presynaptic input spike trains while emitting a single output spike train. This drastic dimensionality reduction suggests considering a neuron as a bottleneck for information transmission. Extending recent results, we propose a simple learning rule for the weights of spiking neurons derived from the information bottleneck (IB) framework that minimizes the loss of relevant information transmitted in the output spike train. In the IB framework, relevance of information is defined with respect to contextual information, the latter entering the proposed learning rule as a "third" factor besides pre- and postsynaptic activities. This renders the theoretically motivated learning rule a plausible model for experimentally observed synaptic plasticity phenomena involving three factors. Furthermore, we show that the proposed IB learning rule allows spiking neurons to learn a predictive code, that is, to extract those parts of their input that are predictive for future input.
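For context, the learning rule described above is derived from the information bottleneck framework of Tishby and colleagues, whose objective trades off compression of the input against preservation of information about a relevance (context) variable. The formulation below uses the conventional IB notation, not notation taken from the paper itself:

```latex
% Information bottleneck Lagrangian (conventional notation):
% X  : the input (here, the presynaptic spike trains)
% T  : the compressed representation (here, the output spike train)
% R  : the relevance/context variable (the "third factor")
% beta : trade-off parameter between compression and relevance
\min_{p(t \mid x)} \; \mathcal{L} \;=\; I(X;T) \;-\; \beta \, I(T;R)
```

Minimizing $I(X;T)$ compresses the input through the bottleneck, while maximizing $I(T;R)$ keeps the output informative about the context signal; the abstract's weight-update rule adjusts synaptic weights so as to reduce the loss of this relevant information.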

