
Sparse neural networks with large learning diversity.

Authors

Gripon Vincent, Berrou Claude

Affiliation

Electronics Department, Télécom Bretagne (Institut Télécom), Brest, France.

Publication

IEEE Trans Neural Netw. 2011 Jul;22(7):1087-96. doi: 10.1109/TNN.2011.2146789. Epub 2011 Jun 7.

DOI: 10.1109/TNN.2011.2146789
PMID: 21652285
Abstract

Coded recurrent neural networks with three levels of sparsity are introduced. The first level is related to the size of messages, which are much smaller than the number of available neurons. The second one is provided by a particular coding rule, acting as a local constraint in the neural activity. The third one is a characteristic of the low final connection density of the network after the learning phase. Though the proposed network is very simple, since it is based on binary neurons and binary connections, it is able to learn a large number of messages and recall them, even in the presence of strong erasures. The performance of the network is assessed as a classifier and as an associative memory.
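To make the abstract concrete, the following is a minimal illustrative sketch of a clique-style binary associative memory in this spirit: neurons are grouped into clusters, a message activates one neuron per cluster, storage adds binary connections forming a clique among the active units, and an erased symbol is recovered by scoring each candidate neuron against the known units. The class and parameter names (`CliqueMemory`, `n_clusters`, `cluster_size`) are assumptions for illustration, not the paper's exact algorithm or notation.

```python
import itertools

class CliqueMemory:
    """Sketch of a clique-based binary associative memory (assumed design)."""

    def __init__(self, n_clusters, cluster_size):
        self.c = n_clusters       # clusters = symbols per message
        self.l = cluster_size     # binary neurons per cluster
        self.W = set()            # sparse set of undirected binary connections

    def _unit(self, cluster, neuron):
        # Flat index of a neuron within the whole network.
        return cluster * self.l + neuron

    def store(self, message):
        # message: one active neuron index per cluster.
        # Storing it fully connects the selected units (a clique).
        units = [self._unit(i, m) for i, m in enumerate(message)]
        for a, b in itertools.combinations(units, 2):
            self.W.add((min(a, b), max(a, b)))

    def recall(self, partial):
        # partial: list with None marking erased clusters. Each erased
        # cluster picks the neuron with the most connections to known units.
        known = [self._unit(i, m) for i, m in enumerate(partial) if m is not None]
        out = list(partial)
        for i, m in enumerate(partial):
            if m is None:
                scores = [
                    sum(1 for k in known
                        if (min(self._unit(i, n), k), max(self._unit(i, n), k)) in self.W)
                    for n in range(self.l)
                ]
                out[i] = max(range(self.l), key=scores.__getitem__)
        return out
```

For example, after storing two messages in a network of 4 clusters of 8 neurons, erasing one symbol and calling `recall` recovers the stored message, since only the correct neuron completes the stored clique. The connection set stays sparse: each stored message adds at most c(c-1)/2 binary edges.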

Similar Articles

1. Sparse neural networks with large learning diversity.
IEEE Trans Neural Netw. 2011 Jul;22(7):1087-96. doi: 10.1109/TNN.2011.2146789. Epub 2011 Jun 7.
2. Neural associative memory with optimal Bayesian learning.
Neural Comput. 2011 Jun;23(6):1393-451. doi: 10.1162/NECO_a_00127. Epub 2011 Mar 11.
3. A modular attractor associative memory with patchy connectivity and weight pruning.
Network. 2013;24(4):129-50. doi: 10.3109/0954898X.2013.859323.
4. Nonbinary associative memory with exponential pattern retrieval capacity and iterative learning.
IEEE Trans Neural Netw Learn Syst. 2014 Mar;25(3):557-70. doi: 10.1109/TNNLS.2013.2277608.
5. Associative memory in quaternionic Hopfield neural network.
Int J Neural Syst. 2008 Apr;18(2):135-45. doi: 10.1142/S0129065708001440.
6. Pseudo-relaxation learning algorithm for complex-valued associative memory.
Int J Neural Syst. 2008 Apr;18(2):147-56. doi: 10.1142/S0129065708001452.
7. A hybrid neural network of addressable and content-addressable memory.
Int J Neural Syst. 2003 Jun;13(3):205-13. doi: 10.1142/S0129065703001546.
8. Encoding binary neural codes in networks of threshold-linear neurons.
Neural Comput. 2013 Nov;25(11):2858-903. doi: 10.1162/NECO_a_00504. Epub 2013 Jul 29.
9. Sequence memory based on coherent spin-interaction neural networks.
Neural Comput. 2014 Dec;26(12):2944-61. doi: 10.1162/NECO_a_00663. Epub 2014 Aug 22.
10. A learning rule for very simple universal approximators consisting of a single layer of perceptrons.
Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. Epub 2007 Dec 31.

Cited By

1. Spiking representation learning for associative memories.
Front Neurosci. 2024 Sep 19;18:1439414. doi: 10.3389/fnins.2024.1439414. eCollection 2024.
2. Attractor and integrator networks in the brain.
Nat Rev Neurosci. 2022 Dec;23(12):744-766. doi: 10.1038/s41583-022-00642-0. Epub 2022 Nov 3.
3. Variable Binding for Sparse Distributed Representations: Theory and Applications.
IEEE Trans Neural Netw Learn Syst. 2023 May;34(5):2191-2204. doi: 10.1109/TNNLS.2021.3105949. Epub 2023 May 2.
4. Technical note: an R package for fitting sparse neural networks with application in animal breeding.
J Anim Sci. 2018 May 4;96(5):2016-2026. doi: 10.1093/jas/sky071.
5. Robust Exponential Memory in Hopfield Networks.
J Math Neurosci. 2018 Jan 16;8(1):1. doi: 10.1186/s13408-017-0056-2.