

The gradient clusteron: A model neuron that learns to solve classification tasks via dendritic nonlinearities, structural plasticity, and gradient descent.

Affiliations

Edmond and Lily Safra Center for Brain Sciences, the Hebrew University of Jerusalem, Jerusalem, Israel.

Department of Neurobiology, the Hebrew University of Jerusalem, Jerusalem, Israel.

Publication Information

PLoS Comput Biol. 2021 May 24;17(5):e1009015. doi: 10.1371/journal.pcbi.1009015. eCollection 2021 May.

DOI: 10.1371/journal.pcbi.1009015
PMID: 34029309
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8177649/
Abstract

Synaptic clustering on neuronal dendrites has been hypothesized to play an important role in implementing pattern recognition. Neighboring synapses on a dendritic branch can interact in a synergistic, cooperative manner via nonlinear voltage-dependent mechanisms, such as NMDA receptors. Inspired by the NMDA receptor, the single-branch clusteron learning algorithm takes advantage of location-dependent multiplicative nonlinearities to solve classification tasks by randomly shuffling the locations of "under-performing" synapses on a model dendrite during learning ("structural plasticity"), eventually resulting in synapses with correlated activity being placed next to each other on the dendrite. We propose an alternative model, the gradient clusteron, or G-clusteron, which uses an analytically-derived gradient descent rule where synapses are "attracted to" or "repelled from" each other in an input- and location-dependent manner. We demonstrate the classification ability of this algorithm by testing it on the MNIST handwritten digit dataset and show that, when using a softmax activation function, the accuracy of the G-clusteron on the all-versus-all MNIST task (85%) approaches that of logistic regression (93%). In addition to the location update rule, we also derive a learning rule for the synaptic weights of the G-clusteron ("functional plasticity") and show that a G-clusteron that utilizes the weight update rule can achieve ~89% accuracy on the MNIST task. We also show that a G-clusteron with both the weight and location update rules can learn to solve the XOR problem from arbitrary initial conditions.
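The abstract describes the G-clusteron's location-update idea only verbally. As a rough, hypothetical illustration (not the paper's actual equations or code), the sketch below implements a clusteron-style activation in which co-active synapses that sit close together on a model dendrite interact multiplicatively through a Gaussian proximity kernel, together with the analytic gradient of that activation with respect to synapse locations; the kernel form, its width, the learning rate, and all variable names are assumptions made for illustration.

```python
import numpy as np

def activation(x, loc, width=1.0):
    """Clusteron-style somatic activation: inputs x at dendritic
    locations loc interact pairwise, weighted by a Gaussian kernel
    of their distance (illustrative form only)."""
    d = loc[:, None] - loc[None, :]        # pairwise signed distances
    K = np.exp(-d**2 / (2 * width**2))     # Gaussian proximity kernel
    return x @ K @ x                       # sum_ij x_i * x_j * K_ij

def location_gradient(x, loc, width=1.0):
    """Analytic d(activation)/d(loc): each synapse is pulled toward
    co-active synapses, more strongly the closer and more active
    they are (derivative of the Gaussian kernel above)."""
    d = loc[:, None] - loc[None, :]
    K = np.exp(-d**2 / (2 * width**2))
    return -2.0 / width**2 * x * ((K * d) @ x)

# Gradient ascent on locations for a pattern the cell should fire on:
x = np.array([1.0, 1.0, 0.0, 1.0])         # binary input pattern
loc = np.array([0.0, 3.0, 1.0, -2.0])      # synapse positions on the dendrite
v0 = activation(x, loc)
for _ in range(100):
    loc = loc + 0.05 * location_gradient(x, loc)  # active synapses cluster
v1 = activation(x, loc)                    # v1 > v0 after clustering
```

Ascending this gradient for a pattern that should drive the cell pulls its co-active synapses together ("attraction"); descending it for a pattern that should not fire pushes them apart ("repulsion"). Because the kernel depends only on pairwise distances, the gradient sums to zero and the activation is unchanged by shifting all locations together.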


Figures (PMC):
Fig 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a448/8177649/641b94f3717f/pcbi.1009015.g001.jpg
Fig 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a448/8177649/c2701784af0a/pcbi.1009015.g002.jpg
Fig 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a448/8177649/ba023a8d3034/pcbi.1009015.g003.jpg
Fig 4: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a448/8177649/38a4be816102/pcbi.1009015.g004.jpg
Fig 5: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a448/8177649/85ab14e673b0/pcbi.1009015.g005.jpg
Fig 6: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/a448/8177649/532d97364452/pcbi.1009015.g006.jpg

Similar Articles

1. The gradient clusteron: A model neuron that learns to solve classification tasks via dendritic nonlinearities, structural plasticity, and gradient descent.
   PLoS Comput Biol. 2021 May 24;17(5):e1009015. doi: 10.1371/journal.pcbi.1009015. eCollection 2021 May.
2. A synaptic learning rule for exploiting nonlinear dendritic computation.
   Neuron. 2021 Dec 15;109(24):4001-4017.e10. doi: 10.1016/j.neuron.2021.09.044. Epub 2021 Oct 28.
3. Neuron as a reward-modulated combinatorial switch and a model of learning behavior.
   Neural Netw. 2013 Oct;46:62-74. doi: 10.1016/j.neunet.2013.04.010. Epub 2013 May 6.
4. Somato-dendritic Synaptic Plasticity and Error-backpropagation in Active Dendrites.
   PLoS Comput Biol. 2016 Feb 3;12(2):e1004638. doi: 10.1371/journal.pcbi.1004638. eCollection 2016 Feb.
5. Learning by the dendritic prediction of somatic spiking.
   Neuron. 2014 Feb 5;81(3):521-8. doi: 10.1016/j.neuron.2013.11.030.
6. Multiclass Classification by Adaptive Network of Dendritic Neurons with Binary Synapses Using Structural Plasticity.
   Front Neurosci. 2016 Mar 31;10:113. doi: 10.3389/fnins.2016.00113. eCollection 2016.
7. Reinforcement learning through modulation of spike-timing-dependent synaptic plasticity.
   Neural Comput. 2007 Jun;19(6):1468-502. doi: 10.1162/neco.2007.19.6.1468.
8. Synaptic Plasticity Depends on the Fine-Scale Input Pattern in Thin Dendrites of CA1 Pyramidal Neurons.
   J Neurosci. 2020 Mar 25;40(13):2593-2605. doi: 10.1523/JNEUROSCI.2071-19.2020. Epub 2020 Feb 11.
9. Hardware-amenable structural learning for spike-based pattern classification using a simple model of active dendrites.
   Neural Comput. 2015 Apr;27(4):845-97. doi: 10.1162/NECO_a_00713. Epub 2015 Mar 3.
10. What can a neuron learn with spike-timing-dependent plasticity?
   Neural Comput. 2005 Nov;17(11):2337-82. doi: 10.1162/0899766054796888.

Cited By

1. Biophysical and computational insights from modeling human cortical pyramidal neurons.
   Front Neurosci. 2025 Jul 9;19:1579715. doi: 10.3389/fnins.2025.1579715. eCollection 2025.
2. The calcitron: A simple neuron model that implements many learning rules via the calcium control hypothesis.
   PLoS Comput Biol. 2025 Jan 29;21(1):e1012754. doi: 10.1371/journal.pcbi.1012754. eCollection 2025 Jan.
3. Cellular computation and cognition.
   Front Comput Neurosci. 2023 Nov 23;17:1107876. doi: 10.3389/fncom.2023.1107876. eCollection 2023.
4. A GPU-based computational framework that bridges neuron simulation and artificial intelligence.
   Nat Commun. 2023 Sep 18;14(1):5798. doi: 10.1038/s41467-023-41553-7.
5. Asymmetric Voltage Attenuation in Dendrites Can Enable Hierarchical Heterosynaptic Plasticity.
   eNeuro. 2023 Jul 17;10(7). doi: 10.1523/ENEURO.0014-23.2023. Print 2023 Jul.
6. Periodicity Pitch Perception Part III: Sensibility and Pachinko Volatility.
   Front Neurosci. 2022 Mar 8;16:736642. doi: 10.3389/fnins.2022.736642. eCollection 2022.
7. A synaptic learning rule for exploiting nonlinear dendritic computation.
   Neuron. 2021 Dec 15;109(24):4001-4017.e10. doi: 10.1016/j.neuron.2021.09.044. Epub 2021 Oct 28.

References

1. The covariance perceptron: A new paradigm for classification and processing of time series in recurrent neuronal networks.
   PLoS Comput Biol. 2020 Oct 12;16(10):e1008127. doi: 10.1371/journal.pcbi.1008127. eCollection 2020 Oct.
2. Perceptron Learning and Classification in a Modeled Cortical Pyramidal Cell.
   Front Comput Neurosci. 2020 Apr 24;14:33. doi: 10.3389/fncom.2020.00033. eCollection 2020.
3. Dendritic action potentials and computation in human layer 2/3 cortical neurons.
   Science. 2020 Jan 3;367(6473):83-87. doi: 10.1126/science.aax6239.
4. Functional clustering of dendritic activity during decision-making.
   Elife. 2019 Oct 30;8:e46966. doi: 10.7554/eLife.46966.
5. Dense connectomic reconstruction in layer 4 of the somatosensory cortex.
   Science. 2019 Nov 29;366(6469). doi: 10.1126/science.aay3134. Epub 2019 Oct 24.
6. A mathematical theory of semantic development in deep neural networks.
   Proc Natl Acad Sci U S A. 2019 Jun 4;116(23):11537-11546. doi: 10.1073/pnas.1820226116. Epub 2019 May 17.
7. A BDNF-Mediated Push-Pull Plasticity Mechanism for Synaptic Clustering.
   Cell Rep. 2018 Aug 21;24(8):2063-2074. doi: 10.1016/j.celrep.2018.07.073.
8. Redundancy in synaptic connections enables neurons to learn optimally.
   Proc Natl Acad Sci U S A. 2018 Jul 17;115(29):E6871-E6879. doi: 10.1073/pnas.1803274115. Epub 2018 Jul 2.
9. Locally coordinated synaptic plasticity of visual cortex neurons in vivo.
   Science. 2018 Jun 22;360(6395):1349-1354. doi: 10.1126/science.aao0862.
10. Dendritic Spine Elimination: Molecular Mechanisms and Implications.
   Neuroscientist. 2019 Feb;25(1):27-47. doi: 10.1177/1073858418769644. Epub 2018 May 2.