Convergence of stochastic learning in perceptrons with binary synapses.

Author Information

Senn Walter, Fusi Stefano

Affiliation

Department of Physiology, University of Bern, CH-3012 Bern, Switzerland.

Publication Information

Phys Rev E Stat Nonlin Soft Matter Phys. 2005 Jun;71(6 Pt 1):061907. doi: 10.1103/PhysRevE.71.061907. Epub 2005 Jun 16.

DOI: 10.1103/PhysRevE.71.061907
PMID: 16089765
Abstract

The efficacy of a biological synapse is naturally bounded and, at some resolution, discrete, at the latest at the level of single vesicles. The finite number of synaptic states dramatically reduces the storage capacity of a network when online learning is considered (i.e., when the synapses are immediately modified by each pattern): the trace of old memories decays exponentially with the number of new memories (palimpsest property). Moreover, finding the discrete synaptic strengths that enable the classification of linearly separable patterns is a combinatorially hard problem known to be NP-complete. In this paper we show that learning with discrete (binary) synapses is nevertheless possible with high probability if a randomly selected fraction of synapses is modified after each stimulus presentation (slow stochastic learning). As an additional constraint, synapses are changed only if the output neuron does not give the desired response, as in classical perceptron learning. We prove that for linearly separable classes of patterns the stochastic learning algorithm converges with arbitrarily high probability in a finite number of presentations, provided that the number of neurons encoding the patterns is large enough. The stochastic learning algorithm is successfully applied to a standard classification problem of nonlinearly separable patterns by using multiple, stochastically independent output units, achieving performance comparable to the best reported for the task.
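The learning rule described in the abstract — binary synapses, updates made only when the output neuron errs, and a small randomly selected fraction of synapses modified per presentation — can be sketched as follows. This is a minimal illustration, not the authors' exact construction: the threshold `theta`, the flip probability `q`, and the candidate-selection rule are assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_binary_perceptron(X, y, q=0.05, theta=25.0, max_epochs=500):
    """Perceptron learning with binary synapses w_i in {0, 1}.

    Synapses are modified only when the output is wrong, and each
    candidate synapse is then flipped independently with a small
    probability q ("slow stochastic learning").  theta and q are
    illustrative assumptions, not parameters from the paper.
    """
    n_patterns, n = X.shape
    w = rng.integers(0, 2, size=n)             # random initial binary weights
    for _ in range(max_epochs):
        errors = 0
        for mu in rng.permutation(n_patterns):
            out = 1 if w @ X[mu] > theta else 0
            if out == y[mu]:
                continue                       # change synapses only on errors
            errors += 1
            active = X[mu] == 1
            if y[mu] == 1:                     # should have fired: potentiate
                candidates = active & (w == 0)
                new_value = 1
            else:                              # should have stayed silent: depress
                candidates = active & (w == 1)
                new_value = 0
            flips = candidates & (rng.random(n) < q)  # random fraction of synapses
            w[flips] = new_value
        if errors == 0:                        # zero training error reached
            break
    return w
```

For linearly separable binary patterns and a large enough input dimension, repeated presentations drive the training error to zero with high probability, in line with the convergence result stated in the abstract; each update flips only a few synapses, which is what makes the learning "slow" and stochastic rather than a deterministic combinatorial search.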


Similar Articles

1. Convergence of stochastic learning in perceptrons with binary synapses.
Phys Rev E Stat Nonlin Soft Matter Phys. 2005 Jun;71(6 Pt 1):061907. doi: 10.1103/PhysRevE.71.061907. Epub 2005 Jun 16.
2. Learning only when necessary: better memories of correlated patterns in networks with bounded synapses.
Neural Comput. 2005 Oct;17(10):2106-38. doi: 10.1162/0899766054615644.
3. A learning rule for very simple universal approximators consisting of a single layer of perceptrons.
Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. Epub 2007 Dec 31.
4. Synaptic dynamics: linear model and adaptation algorithm.
Neural Netw. 2014 Aug;56:49-68. doi: 10.1016/j.neunet.2014.04.001. Epub 2014 Apr 28.
5. A fast and convergent stochastic MLP learning algorithm.
Int J Neural Syst. 2001 Dec;11(6):573-83. doi: 10.1142/S0129065701000977.
6. Efficient supervised learning in networks with binary synapses.
Proc Natl Acad Sci U S A. 2007 Jun 26;104(26):11079-84. doi: 10.1073/pnas.0700324104. Epub 2007 Jun 20.
7. Spiking perceptrons.
IEEE Trans Neural Netw. 2006 May;17(3):803-7. doi: 10.1109/TNN.2006.873274.
8. Learning by message passing in networks of discrete synapses.
Phys Rev Lett. 2006 Jan 27;96(3):030201. doi: 10.1103/PhysRevLett.96.030201. Epub 2006 Jan 25.
9. A stochastic population approach to the problem of stable recruitment hierarchies in spiking neural networks.
Biol Cybern. 2006 Jan;94(1):33-45. doi: 10.1007/s00422-005-0023-y. Epub 2005 Nov 10.
10. On the classification capability of sign-constrained perceptrons.
Neural Comput. 2008 Jan;20(1):288-309. doi: 10.1162/neco.2008.20.1.288.

Cited By

1. Chalcogenide Ovonic Threshold Switching Selector.
Nanomicro Lett. 2024 Jan 11;16(1):81. doi: 10.1007/s40820-023-01289-x.
2. Dynamical Characteristics of Recurrent Neuronal Networks Are Robust Against Low Synaptic Weight Resolution.
Front Neurosci. 2021 Dec 24;15:757790. doi: 10.3389/fnins.2021.757790. eCollection 2021.
3. Neuromorphic Spintronics.
Nat Electron. 2020;3(7). doi: 10.1038/s41928-019-0360-9.
4. Unsupervised learning using stochastic switching in magneto-electric magnetic tunnel junctions.
Philos Trans A Math Phys Eng Sci. 2020 Feb 7;378(2164):20190157. doi: 10.1098/rsta.2019.0157. Epub 2019 Dec 23.
5. Spintronic Nanodevices for Bioinspired Computing.
Proc IEEE Inst Electr Electron Eng. 2016 Oct;104(10):2024-2039. doi: 10.1109/JPROC.2016.2597152. Epub 2016 Sep 8.
6. Multiclass Classification by Adaptive Network of Dendritic Neurons with Binary Synapses Using Structural Plasticity.
Front Neurosci. 2016 Mar 31;10:113. doi: 10.3389/fnins.2016.00113. eCollection 2016.
7. Stochastic learning in oxide binary synaptic device for neuromorphic computing.
Front Neurosci. 2013 Oct 31;7:186. doi: 10.3389/fnins.2013.00186. eCollection 2013.
8. Recurrent network of perceptrons with three state synapses achieves competitive classification on real inputs.
Front Comput Neurosci. 2012 Jun 22;6:39. doi: 10.3389/fncom.2012.00039. eCollection 2012.
9. Efficient supervised learning in networks with binary synapses.
Proc Natl Acad Sci U S A. 2007 Jun 26;104(26):11079-84. doi: 10.1073/pnas.0700324104. Epub 2007 Jun 20.
10. Stability of discrete memory states to stochastic fluctuations in neuronal systems.
Chaos. 2006 Jun;16(2):026109. doi: 10.1063/1.2208923.