Suppr 超能文献



Parallel synapses with transmission nonlinearities enhance neuronal classification capacity.

Authors

Song Yuru, Benna Marcus K

Affiliations

Neurosciences Graduate Program, University of California, San Diego, La Jolla, California, United States of America.

Department of Neurobiology, School of Biological Sciences, University of California, San Diego, La Jolla, California, United States of America.

Publication

PLoS Comput Biol. 2025 May 9;21(5):e1012285. doi: 10.1371/journal.pcbi.1012285. eCollection 2025 May.

DOI: 10.1371/journal.pcbi.1012285
PMID: 40344022
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC12063901/
Abstract

Cortical neurons often establish multiple synaptic contacts with the same postsynaptic neuron. To avoid functional redundancy of these parallel synapses, it is crucial that each synapse exhibits distinct computational properties. Here we model the current to the soma contributed by each synapse as a sigmoidal transmission function of its presynaptic input, with learnable parameters such as amplitude, slope, and threshold. We evaluate the classification capacity of a neuron equipped with such nonlinear parallel synapses, and show that with a small number of parallel synapses per axon, it substantially exceeds that of the Perceptron. Furthermore, the number of correctly classified data points can increase superlinearly as the number of presynaptic axons grows. When training with an unrestricted number of parallel synapses, our model neuron can effectively implement an arbitrary aggregate transmission function for each axon, constrained only by monotonicity. Nevertheless, successful learning in the model neuron often requires only a small number of parallel synapses. We also apply these parallel synapses in a feedforward neural network trained to classify MNIST images, and show that they can increase the test accuracy. This demonstrates that multiple nonlinear synapses per input axon can substantially enhance a neuron's computational power.
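The model described in the abstract — each synapse contributing a sigmoidal transmission function of its presynaptic input, with learnable amplitude, slope, and threshold, summed at the soma — can be sketched in a few lines of NumPy. The class name, parameter initialization, toy task, and training loop below are illustrative assumptions reconstructed from the abstract alone, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ParallelSynapseNeuron:
    """Single output unit with K parallel sigmoidal synapses per input axon.

    Somatic drive for an input x (one value per axon):
        out(x) = sum_{i,k} A[i,k] * sigmoid(s[i,k] * (x[i] - theta[i,k])) - b
    Amplitudes A, slopes s, thresholds theta, and the somatic bias b are all
    learnable, mirroring the parameterization named in the abstract.
    """

    def __init__(self, n_axons, k_synapses):
        self.A = rng.normal(0.0, 1.0, (n_axons, k_synapses))
        self.s = rng.uniform(1.0, 3.0, (n_axons, k_synapses))
        self.theta = rng.normal(0.0, 0.5, (n_axons, k_synapses))
        self.b = 0.0

    def forward(self, X):
        # X: [batch, n_axons]; broadcast against [n_axons, k_synapses]
        z = self.s * (X[:, :, None] - self.theta)
        return (self.A * sigmoid(z)).sum(axis=(1, 2)) - self.b

    def train_step(self, X, y, lr=0.3):
        """One full-batch gradient step on the binary cross-entropy loss."""
        z = self.s * (X[:, :, None] - self.theta)
        g = sigmoid(z)
        p = sigmoid((self.A * g).sum(axis=(1, 2)) - self.b)
        err = (p - y)[:, None, None]      # dLoss/d(out), per sample
        dg = g * (1.0 - g)                # sigmoid derivative
        grad_A = (err * g).mean(axis=0)
        grad_s = (err * self.A * dg * (X[:, :, None] - self.theta)).mean(axis=0)
        grad_theta = (err * self.A * dg * (-self.s)).mean(axis=0)
        grad_b = -(p - y).mean()
        self.A -= lr * grad_A
        self.s -= lr * grad_s
        self.theta -= lr * grad_theta
        self.b -= lr * grad_b
        return -np.mean(y * np.log(p + 1e-12) + (1 - y) * np.log(1 - p + 1e-12))

# Toy binary task whose decision rule is curved, yet monotone in each axon's
# input -- the regime where per-axon nonlinear transmission helps.
X = rng.uniform(-1.0, 1.0, (200, 2))
y = (sigmoid(4 * X[:, 0]) + sigmoid(4 * X[:, 1]) > 1.3).astype(float)

neuron = ParallelSynapseNeuron(n_axons=2, k_synapses=3)
first_loss = neuron.train_step(X, y)
for _ in range(4000):
    last_loss = neuron.train_step(X, y)
accuracy = np.mean((neuron.forward(X) > 0.0) == (y == 1.0))
```

Note that the sum over the K synapses of one axon implements the per-axon aggregate transmission function the abstract refers to; with unrestricted K and suitably signed amplitudes, that aggregate can approximate an arbitrary monotone function of the axon's input.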


Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bfd0/12063901/1a6d2ff81518/pcbi.1012285.g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bfd0/12063901/3ddff4ef4d49/pcbi.1012285.g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bfd0/12063901/27c00385a2e4/pcbi.1012285.g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bfd0/12063901/357b87874309/pcbi.1012285.g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bfd0/12063901/be2eb75c3bfe/pcbi.1012285.g005.jpg

Similar Articles

1. Parallel synapses with transmission nonlinearities enhance neuronal classification capacity.
   PLoS Comput Biol. 2025 May 9;21(5):e1012285. doi: 10.1371/journal.pcbi.1012285. eCollection 2025 May.
2. Parallel Synapses with Transmission Nonlinearities Enhance Neuronal Classification Capacity.
   bioRxiv. 2024 Jul 4:2024.07.01.601490. doi: 10.1101/2024.07.01.601490.
3. Role of synaptic dynamics and heterogeneity in neuronal learning of temporal code.
   J Neurophysiol. 2013 Nov;110(10):2275-86. doi: 10.1152/jn.00454.2013. Epub 2013 Aug 7.
4. A New Computational Model for Astrocytes and Their Role in Biologically Realistic Neural Networks.
   Comput Intell Neurosci. 2018 Jul 5;2018:3689487. doi: 10.1155/2018/3689487. eCollection 2018.
5. Neuron as a reward-modulated combinatorial switch and a model of learning behavior.
   Neural Netw. 2013 Oct;46:62-74. doi: 10.1016/j.neunet.2013.04.010. Epub 2013 May 6.
6. Local specification of relative strengths of synapses between different abdominal stretch-receptor axons and their common target neurons.
   J Neurosci. 2001 Mar 1;21(5):1645-55. doi: 10.1523/JNEUROSCI.21-05-01645.2001.
7. Reducing the variability of neural responses: a computational theory of spike-timing-dependent plasticity.
   Neural Comput. 2007 Feb;19(2):371-403. doi: 10.1162/neco.2007.19.2.371.
8. Synaptic dynamics: linear model and adaptation algorithm.
   Neural Netw. 2014 Aug;56:49-68. doi: 10.1016/j.neunet.2014.04.001. Epub 2014 Apr 28.
9. What can a neuron learn with spike-timing-dependent plasticity?
   Neural Comput. 2005 Nov;17(11):2337-82. doi: 10.1162/0899766054796888.
10. DoGNet: A deep architecture for synapse detection in multiplexed fluorescence images.
    PLoS Comput Biol. 2019 May 13;15(5):e1007012. doi: 10.1371/journal.pcbi.1007012. eCollection 2019 May.

References Cited in This Article

1. Might a Single Neuron Solve Interesting Machine Learning Problems Through Successive Computations on Its Dendritic Tree?
   Neural Comput. 2021 May 13;33(6):1554-1571. doi: 10.1162/neco_a_01390.
2. Activation function dependence of the storage capacity of treelike neural networks.
   Phys Rev E. 2021 Feb;103(2):L020301. doi: 10.1103/PhysRevE.103.L020301.
3. Structure and function of a neocortical synapse.
   Nature. 2021 Mar;591(7848):111-116. doi: 10.1038/s41586-020-03134-2. Epub 2021 Jan 13.
4. Illuminating dendritic function with computational models.
   Nat Rev Neurosci. 2020 Jun;21(6):303-321. doi: 10.1038/s41583-020-0301-7. Epub 2020 May 11.
5. Redundancy in synaptic connections enables neurons to learn optimally.
   Proc Natl Acad Sci U S A. 2018 Jul 17;115(29):E6871-E6879. doi: 10.1073/pnas.1803274115. Epub 2018 Jul 2.
6. Axonal synapse sorting in medial entorhinal cortex.
   Nature. 2017 Sep 28;549(7673):469-475. doi: 10.1038/nature24005. Epub 2017 Sep 20.
7. Neurotransmitter Switching in the Developing and Adult Brain.
   Annu Rev Neurosci. 2017 Jul 25;40:1-19. doi: 10.1146/annurev-neuro-072116-031204. Epub 2017 Mar 6.
8. Is cortical connectivity optimized for storing information?
   Nat Neurosci. 2016 May;19(5):749-755. doi: 10.1038/nn.4286. Epub 2016 Apr 11.
9. Anatomy and function of an excitatory network in the visual cortex.
   Nature. 2016 Apr 21;532(7599):370-4. doi: 10.1038/nature17192. Epub 2016 Mar 28.
10. Reconstruction and Simulation of Neocortical Microcircuitry.
    Cell. 2015 Oct 8;163(2):456-92. doi: 10.1016/j.cell.2015.09.029.