


Learning may need only a few bits of synaptic precision.

Affiliations

Department of Applied Science and Technology, Politecnico di Torino, Corso Duca degli Abruzzi 24, I-10129 Torino, Italy.

Human Genetics Foundation-Torino, Via Nizza 52, I-10126 Torino, Italy.

Publication information

Phys Rev E. 2016 May;93(5):052313. doi: 10.1103/PhysRevE.93.052313. Epub 2016 May 27.

DOI: 10.1103/PhysRevE.93.052313
PMID: 27300916
Abstract

Learning in neural networks poses peculiar challenges when using discretized rather than continuous synaptic states. The choice of discrete synapses is motivated by biological reasoning and experiments, and possibly by hardware implementation considerations as well. In this paper we extend a previous large deviations analysis which unveiled the existence of peculiar dense regions in the space of synaptic states which account for the possibility of learning efficiently in networks with binary synapses. We extend the analysis to synapses with multiple states and generally more plausible biological features. The results clearly indicate that the overall qualitative picture is unchanged with respect to the binary case, and very robust to variations of the details of the model. We also provide quantitative results which suggest that the advantages of increasing the synaptic precision (i.e., the number of internal synaptic states) rapidly vanish after the first few bits, and therefore that, for practical applications, only a few bits may be needed for near-optimal performance, consistent with recent biological findings. Finally, we demonstrate how the theoretical analysis can be exploited to design efficient algorithmic search strategies.
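The setting described in the abstract can be illustrated with a toy experiment. The sketch below is not the paper's algorithm; it is a minimal clipped-perceptron variant in which each synapse holds a small bounded integer state (a few bits of precision) and the effective weight is only the sign of that state. All parameter values (N, P, H) are illustrative assumptions, chosen to make training feasible at low pattern load.

```python
import numpy as np

# Illustrative sketch (NOT the paper's algorithm): a perceptron whose
# synapses store a bounded integer hidden state h_i in [-H, H], i.e. a
# few bits of precision. The effective weight is sign(h_i); on each
# classification error the hidden state is nudged and clipped.

rng = np.random.default_rng(0)

N, P, H = 501, 40, 3                      # inputs, patterns, state bound (assumed)
X = rng.choice([-1, 1], size=(P, N))      # random binary input patterns
y = rng.choice([-1, 1], size=P)           # random target labels

h = np.zeros(N, dtype=int)                # discrete synaptic states

for epoch in range(200):
    errors = 0
    for mu in rng.permutation(P):
        w = np.sign(h + 0.5)              # effective binary weights (ties -> +1)
        if np.sign(w @ X[mu]) != y[mu]:   # N is odd, so the field is never zero
            errors += 1
            # standard perceptron-style update, clipped to the few-bit range
            h = np.clip(h + y[mu] * X[mu], -H, H)
    if errors == 0:
        break

w = np.sign(h + 0.5)
accuracy = np.mean(np.sign(X @ w) == y)
print(f"training accuracy: {accuracy:.2f}")
```

At this low pattern load the bounded-state update typically classifies the training set correctly despite the weights carrying only about three bits each, which is the qualitative point of the paper; at higher loads a plain clipped update degrades, and smarter search strategies of the kind the authors analyze become necessary.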


Similar articles

1. Learning may need only a few bits of synaptic precision.
Phys Rev E. 2016 May;93(5):052313. doi: 10.1103/PhysRevE.93.052313. Epub 2016 May 27.
2. Efficient supervised learning in networks with binary synapses.
Proc Natl Acad Sci U S A. 2007 Jun 26;104(26):11079-84. doi: 10.1073/pnas.0700324104. Epub 2007 Jun 20.
3. Efficient Associative Computation with Discrete Synapses.
Neural Comput. 2016 Jan;28(1):118-86. doi: 10.1162/NECO_a_00795. Epub 2015 Nov 24.
4. Convergence of stochastic learning in perceptrons with binary synapses.
Phys Rev E Stat Nonlin Soft Matter Phys. 2005 Jun;71(6 Pt 1):061907. doi: 10.1103/PhysRevE.71.061907. Epub 2005 Jun 16.
5. Synaptic dynamics: linear model and adaptation algorithm.
Neural Netw. 2014 Aug;56:49-68. doi: 10.1016/j.neunet.2014.04.001. Epub 2014 Apr 28.
6. Neural associative memory with optimal Bayesian learning.
Neural Comput. 2011 Jun;23(6):1393-451. doi: 10.1162/NECO_a_00127. Epub 2011 Mar 11.
7. Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses.
Phys Rev Lett. 2015 Sep 18;115(12):128101. doi: 10.1103/PhysRevLett.115.128101.
8. Precise Synaptic Efficacy Alignment Suggests Potentiation Dominated Learning.
Front Neural Circuits. 2016 Jan 13;9:90. doi: 10.3389/fncir.2015.00090. eCollection 2015.
9. Biologically plausible learning in neural networks: a lesson from bacterial chemotaxis.
Biol Cybern. 2009 Dec;101(5-6):379-85. doi: 10.1007/s00422-009-0341-6. Epub 2009 Oct 21.
10. Nanoscale RRAM-based synaptic electronics: toward a neuromorphic computing device.
Nanotechnology. 2013 Sep 27;24(38):384009. doi: 10.1088/0957-4484/24/38/384009. Epub 2013 Sep 2.

Cited by

1. Shaping the learning landscape in neural networks around wide flat minima.
Proc Natl Acad Sci U S A. 2020 Jan 7;117(1):161-170. doi: 10.1073/pnas.1908636117. Epub 2019 Dec 23.
2. Efficiency of quantum vs. classical annealing in nonconvex learning problems.
Proc Natl Acad Sci U S A. 2018 Feb 13;115(7):1457-1462. doi: 10.1073/pnas.1711456115. Epub 2018 Jan 30.
3. Unreasonable effectiveness of learning neural networks: From accessible states and robust ensembles to basic algorithmic schemes.
Proc Natl Acad Sci U S A. 2016 Nov 29;113(48):E7655-E7662. doi: 10.1073/pnas.1608103113. Epub 2016 Nov 15.