

Fast sigmoidal networks via spiking neurons.

Author Information

Maass W

Affiliation

Institute for Theoretical Computer Science, Technische Universitaet Graz, Austria.

Publication Information

Neural Comput. 1997 Feb 15;9(2):279-304. doi: 10.1162/neco.1997.9.2.279.

DOI: 10.1162/neco.1997.9.2.279
PMID: 9117904
Abstract

We show that networks of relatively realistic mathematical models for biological neurons in principle can simulate arbitrary feedforward sigmoidal neural nets in a way that has previously not been considered. This new approach is based on temporal coding by single spikes (respectively by the timing of synchronous firing in pools of neurons) rather than on the traditional interpretation of analog variables in terms of firing rates. The resulting new simulation is substantially faster and hence more consistent with experimental results about the maximal speed of information processing in cortical neural systems. As a consequence we can show that networks of noisy spiking neurons are "universal approximators" in the sense that they can approximate with regard to temporal coding any given continuous function of several variables. This result holds for a fairly large class of schemes for coding analog variables by firing times of spiking neurons. This new proposal for the possible organization of computations in networks of spiking neurons has some interesting consequences for the type of learning rules that would be needed to explain the self-organization of such networks. Finally, the fast and noise-robust implementation of sigmoidal neural nets by temporal coding points to possible new ways of implementing feedforward and recurrent sigmoidal neural nets with pulse stream VLSI.

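The central idea of the abstract — representing an analog value by the firing time of a single spike, with earlier spikes coding larger values, instead of by a firing rate — can be sketched in a few lines. This is only an illustrative toy of the coding convention: the constants, function names, and the shortcut of computing the sigmoid arithmetically (rather than through neuron dynamics, as in the paper's actual construction) are assumptions made here for illustration.

```python
import math

# Temporal coding: an analog value x in [0, 1] is represented by one
# spike fired at time t = T_IN - x * DELTA, so larger values fire
# earlier. T_IN and DELTA are hypothetical constants, not from the paper.
T_IN = 10.0   # reference time for input spikes (ms)
DELTA = 5.0   # length of the coding interval (ms)

def encode(x):
    """Analog value -> spike time (earlier spike = larger value)."""
    return T_IN - x * DELTA

def decode(t):
    """Spike time -> analog value (inverse of encode)."""
    return (T_IN - t) / DELTA

def sigmoid(u):
    return 1.0 / (1.0 + math.exp(-u))

def spiking_gate(xs, ws, b):
    """Toy sigmoidal gate in the temporal code: read off the inputs
    from their spike times, apply the weighted sum and sigmoid, and
    re-emit the result as a single output spike time."""
    ts = [encode(x) for x in xs]
    u = sum(w * decode(t) for w, t in zip(ws, ts)) + b
    return encode(sigmoid(u))

# Round-trip check: decoding the output spike time recovers
# sigmoid(w . x + b), here sigmoid(0.3 - 0.4 + 0.1) = sigmoid(0) = 0.5.
y_time = spiking_gate([0.2, 0.8], [1.5, -0.5], 0.1)
y = decode(y_time)
```

Note the property the abstract emphasizes: the output of each gate is available after a single spike per neuron, rather than after the many spikes needed to estimate a firing rate, which is what makes the simulation fast.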

Similar Articles

1
Fast sigmoidal networks via spiking neurons.
Neural Comput. 1997 Feb 15;9(2):279-304. doi: 10.1162/neco.1997.9.2.279.
2
Supervised learning in spiking neural networks with ReSuMe: sequence learning, classification, and spike shifting.
Neural Comput. 2010 Feb;22(2):467-510. doi: 10.1162/neco.2009.11-08-901.
3
Spiking neural networks.
Int J Neural Syst. 2009 Aug;19(4):295-308. doi: 10.1142/S0129065709002002.
4
Investigating the computational power of spiking neurons with non-standard behaviors.
Neural Netw. 2013 Jul;43:41-54. doi: 10.1016/j.neunet.2013.01.011. Epub 2013 Feb 9.
5
Learning beyond finite memory in recurrent networks of spiking neurons.
Neural Comput. 2006 Mar;18(3):591-613. doi: 10.1162/089976606775623360.
6
A supervised multi-spike learning algorithm based on gradient descent for spiking neural networks.
Neural Netw. 2013 Jul;43:99-113. doi: 10.1016/j.neunet.2013.02.003. Epub 2013 Feb 16.
7
Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons.
PLoS Comput Biol. 2011 Nov;7(11):e1002211. doi: 10.1371/journal.pcbi.1002211. Epub 2011 Nov 3.
8
On the sample complexity of learning for networks of spiking neurons with nonlinear synaptic interactions.
IEEE Trans Neural Netw. 2004 Sep;15(5):995-1001. doi: 10.1109/TNN.2004.832810.
9
A model for fast analog computation based on unreliable synapses.
Neural Comput. 2000 Jul;12(7):1679-704. doi: 10.1162/089976600300015303.
10
STDP-based spiking deep convolutional neural networks for object recognition.
Neural Netw. 2018 Mar;99:56-67. doi: 10.1016/j.neunet.2017.12.005. Epub 2017 Dec 23.

Cited By

1
High-performance deep spiking neural networks with 0.3 spikes per neuron.
Nat Commun. 2024 Aug 9;15(1):6793. doi: 10.1038/s41467-024-51110-5.
2
Training spiking neuronal networks to perform motor control using reinforcement and evolutionary learning.
Front Comput Neurosci. 2022 Sep 30;16:1017284. doi: 10.3389/fncom.2022.1017284. eCollection 2022.
3
A Complex-Valued Oscillatory Neural Network for Storage and Retrieval of Multidimensional Aperiodic Signals.
Front Comput Neurosci. 2021 May 24;15:551111. doi: 10.3389/fncom.2021.551111. eCollection 2021.
4
FusionSense: Emotion Classification Using Feature Fusion of Multimodal Data and Deep Learning in a Brain-Inspired Spiking Neural Network.
Sensors (Basel). 2020 Sep 17;20(18):5328. doi: 10.3390/s20185328.
5
An Oscillatory Neural Autoencoder Based on Frequency Modulation and Multiplexing.
Front Comput Neurosci. 2018 Jul 10;12:52. doi: 10.3389/fncom.2018.00052. eCollection 2018.
6
Implementing Signature Neural Networks with Spiking Neurons.
Front Comput Neurosci. 2016 Dec 20;10:132. doi: 10.3389/fncom.2016.00132. eCollection 2016.
7
Modeling compositionality by dynamic binding of synfire chains.
J Comput Neurosci. 2004 Sep-Oct;17(2):179-201. doi: 10.1023/B:JCNS.0000037682.18051.5f.
8
Invariant representations of visual patterns in a temporal population code.
Proc Natl Acad Sci U S A. 2003 Jan 7;100(1):324-9. doi: 10.1073/pnas.0136977100. Epub 2002 Dec 26.
9
Adapting a feedforward heteroassociative network to Hodgkin-Huxley dynamics.
J Comput Neurosci. 1998 Dec;5(4):353-64. doi: 10.1023/a:1026456411040.