Sparse neural networks with large learning diversity.

Authors

Gripon Vincent, Berrou Claude

Affiliation

Electronics Department, Télécom Bretagne (Institut Télécom), Brest, France.

Publication

IEEE Trans Neural Netw. 2011 Jul;22(7):1087-96. doi: 10.1109/TNN.2011.2146789. Epub 2011 Jun 7.

Abstract

Coded recurrent neural networks with three levels of sparsity are introduced. The first level relates to the size of the messages, which are much smaller than the number of available neurons. The second is provided by a particular coding rule that acts as a local constraint on the neural activity. The third is the low final connection density of the network after the learning phase. Although the proposed network is very simple, being based on binary neurons and binary connections, it is able to learn a large number of messages and recall them, even in the presence of strong erasures. The performance of the network is assessed both as a classifier and as an associative memory.
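The abstract describes an associative memory built from binary neurons and binary connections that stores short messages and recalls them despite erasures. The following is a minimal illustrative sketch of one way such a network can be organized, assuming a cluster-based structure in which each message symbol activates one neuron per cluster and a learned message becomes a fully connected clique; the parameter values and function names here are hypothetical choices for illustration, not taken from the paper:

```python
import numpy as np

# Illustrative parameters (not from the paper):
C = 4      # number of clusters (one per message symbol)
L = 16     # neurons per cluster, so each symbol is in [0, L)
n = C * L  # total number of neurons

# Binary connection matrix: W[i, j] = 1 iff neurons i and j are connected.
W = np.zeros((n, n), dtype=np.uint8)

def neuron_index(cluster, symbol):
    """Map a (cluster, symbol) pair to a global neuron index."""
    return cluster * L + symbol

def learn(message):
    """Store a message (one symbol per cluster) as a clique:
    fully interconnect its C active neurons with binary edges."""
    idx = [neuron_index(c, s) for c, s in enumerate(message)]
    for i in idx:
        for j in idx:
            if i != j:
                W[i, j] = 1

def recall(partial):
    """Recover a message from a partial one (None marks an erased symbol).
    Each erased cluster selects the symbol whose neuron has the most
    connections to the known active neurons (winner-take-all)."""
    known = [neuron_index(c, s) for c, s in enumerate(partial) if s is not None]
    out = list(partial)
    for c, s in enumerate(partial):
        if s is None:
            scores = [W[neuron_index(c, v), known].sum() for v in range(L)]
            out[c] = int(np.argmax(scores))
    return out

# Usage: learn a 4-symbol message, then recall it with one symbol erased.
msg = [3, 7, 1, 12]
learn(msg)
assert recall([3, 7, None, 12]) == msg
```

The sparsity described in the abstract appears at each level of this sketch: only C of the n neurons are active per message, the coding rule allows only one active neuron per cluster, and the connection matrix stays sparse as long as the number of stored messages is small relative to the number of possible cliques.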
