
Passive Nonlinear Dendritic Interactions as a Computational Resource in Spiking Neural Networks.

Affiliations

Centre for Theoretical Neuroscience, University of Waterloo, Waterloo, Ontario, N2L 3G1, Canada

Publication Information

Neural Comput. 2021 Jan;33(1):96-128. doi: 10.1162/neco_a_01338. Epub 2020 Oct 20.

DOI: 10.1162/neco_a_01338
PMID: 33080158
Abstract

Nonlinear interactions in the dendritic tree play a key role in neural computation. Nevertheless, modeling frameworks aimed at the construction of large-scale, functional spiking neural networks, such as the Neural Engineering Framework, tend to assume a linear superposition of postsynaptic currents. In this letter, we present a series of extensions to the Neural Engineering Framework that facilitate the construction of networks incorporating Dale's principle and nonlinear conductance-based synapses. We apply these extensions to a two-compartment LIF neuron that can be seen as a simple model of passive dendritic computation. We show that it is possible to incorporate neuron models with input-dependent nonlinearities into the Neural Engineering Framework without compromising high-level function and that nonlinear postsynaptic currents can be systematically exploited to compute a wide variety of multivariate, band-limited functions, including the Euclidean norm, controlled shunting, and nonnegative multiplication. By avoiding an additional source of spike noise, the function approximation accuracy of a single layer of two-compartment LIF neurons is on a par with or even surpasses that of two-layer spiking neural networks up to a certain target function bandwidth.
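To make the abstract's setup concrete, the following is a minimal illustrative sketch of a two-compartment LIF neuron with conductance-based synapses. It is not the authors' NEF model: the integration scheme (forward Euler), the `simulate` function, and all parameter values are hypothetical choices for demonstration. The point it illustrates is the one the abstract relies on: because synaptic input enters as conductances in the dendritic compartment, the somatic firing rate is a nonlinear function of the excitatory/inhibitory input pair, rather than a linear superposition of currents.

```python
# Illustrative two-compartment LIF neuron with conductance-based
# dendritic synapses (forward-Euler sketch; parameters hypothetical).

def simulate(gE, gI, T=1.0, dt=1e-4):
    """Count spikes over T seconds for constant excitatory (gE) and
    inhibitory (gI) dendritic conductances, in units of the leak."""
    EL, EE, EI = 0.0, 4.0, -1.0    # leak / excitatory / inhibitory reversal potentials
    gC = 0.5                        # somatic-dendritic coupling conductance
    tau = 0.02                      # membrane time constant (s)
    v_th, v_reset = 1.0, 0.0        # somatic spike threshold and reset
    vS = vD = 0.0
    spikes = 0
    for _ in range(int(T / dt)):
        # Dendrite: leak + synaptic conductances + coupling to soma.
        dvD = (-(vD - EL) + gE * (EE - vD) + gI * (EI - vD)
               + gC * (vS - vD)) / tau
        # Soma: leak + coupling current from the dendrite.
        dvS = (-(vS - EL) + gC * (vD - vS)) / tau
        vD += dt * dvD
        vS += dt * dvS
        if vS >= v_th:              # threshold crossing: emit spike, reset soma
            vS = v_reset
            spikes += 1
    return spikes
```

Because the inhibitory conductance `gI` scales the dendritic potential multiplicatively (shunting) instead of subtracting a fixed current, the input-output map over `(gE, gI)` is nonlinear, which is the kind of dendritic nonlinearity the letter exploits for computing multivariate functions such as nonnegative multiplication.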


Similar Articles

1
Passive Nonlinear Dendritic Interactions as a Computational Resource in Spiking Neural Networks.
Neural Comput. 2021 Jan;33(1):96-128. doi: 10.1162/neco_a_01338. Epub 2020 Oct 20.
2
The passive properties of dendrites modulate the propagation of slowly-varying firing rate in feedforward networks.
Neural Netw. 2022 Jun;150:377-391. doi: 10.1016/j.neunet.2022.03.001. Epub 2022 Mar 9.
3
Approximating Nonlinear Functions With Latent Boundaries in Low-Rank Excitatory-Inhibitory Spiking Networks.
Neural Comput. 2024 Apr 23;36(5):803-857. doi: 10.1162/neco_a_01658.
4
Passive dendrites enable single neurons to compute linearly non-separable functions.
PLoS Comput Biol. 2013;9(2):e1002867. doi: 10.1371/journal.pcbi.1002867. Epub 2013 Feb 28.
5
Do Biological Constraints Impair Dendritic Computation?
Neuroscience. 2022 May 1;489:262-274. doi: 10.1016/j.neuroscience.2021.07.036. Epub 2021 Aug 6.
6
Systems-based analysis of dendritic nonlinearities reveals temporal feature extraction in mouse L5 cortical neurons.
J Neurophysiol. 2017 Jun 1;117(6):2188-2208. doi: 10.1152/jn.00951.2016. Epub 2017 Mar 1.
7
Neurons with Multiplicative Interactions of Nonlinear Synapses.
Int J Neural Syst. 2019 Oct;29(8):1950012. doi: 10.1142/S0129065719500126. Epub 2019 Mar 26.
8
Small universal spiking neural P systems with dendritic/axonal delays and dendritic trunk/feedback.
Neural Netw. 2021 Jun;138:126-139. doi: 10.1016/j.neunet.2021.02.010. Epub 2021 Feb 16.
9
Computing with the leaky integrate-and-fire neuron: logarithmic computation and multiplication.
Neural Comput. 1997 Feb 15;9(2):305-18. doi: 10.1162/neco.1997.9.2.305.
10
Linking structure and activity in nonlinear spiking networks.
PLoS Comput Biol. 2017 Jun 23;13(6):e1005583. doi: 10.1371/journal.pcbi.1005583. eCollection 2017 Jun.

Cited By

1
Brain-like hardware, do we need it?
Front Neurosci. 2024 Dec 16;18:1465789. doi: 10.3389/fnins.2024.1465789. eCollection 2024.
2
A survey and perspective on neuromorphic continual learning systems.
Front Neurosci. 2023 May 4;17:1149410. doi: 10.3389/fnins.2023.1149410. eCollection 2023.
3
Biologically-Based Computation: How Neural Details and Dynamics Are Suited for Implementing a Variety of Algorithms.
Brain Sci. 2023 Jan 31;13(2):245. doi: 10.3390/brainsci13020245.
4
Periodicity Pitch Perception Part III: Sensibility and Pachinko Volatility.
Front Neurosci. 2022 Mar 8;16:736642. doi: 10.3389/fnins.2022.736642. eCollection 2022.