Gripon Vincent, Berrou Claude
Electronics Department, Télécom Bretagne (Institut Télécom), Brest, France.
IEEE Trans Neural Netw. 2011 Jul;22(7):1087-96. doi: 10.1109/TNN.2011.2146789. Epub 2011 Jun 7.
Coded recurrent neural networks with three levels of sparsity are introduced. The first level relates to the size of the messages, which is much smaller than the number of available neurons. The second is imposed by a particular coding rule that acts as a local constraint on neural activity. The third is the low final connection density of the network after the learning phase. Although the proposed network is very simple, being based on binary neurons and binary connections, it is able to learn a large number of messages and recall them, even in the presence of strong erasures. The performance of the network is assessed both as a classifier and as an associative memory.
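To make the three levels of sparsity concrete, the following is a minimal sketch of a clustered binary associative memory in the spirit of the abstract. The structure (neurons partitioned into clusters, one active neuron per cluster, messages stored as cliques of binary connections) follows the paper's general description; all names, parameter values, and the single-pass decoding rule are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

# Illustrative sketch (parameters C, L and all names are assumptions).
# n neurons are split into C clusters of L neurons each; a message picks
# one neuron per cluster, so messages are much smaller than the network
# (first sparsity level).
C, L = 8, 256
n = C * L
W = np.zeros((n, n), dtype=bool)  # binary connections; stays sparse after
                                  # learning (third sparsity level)

def neuron(cluster, symbol):
    """Index of the single neuron a symbol activates in its cluster: the
    coding rule acting as a local constraint (second sparsity level)."""
    return cluster * L + symbol

def learn(message):
    """Store a message (one symbol per cluster) as a fully connected
    clique of binary connections between its selected neurons."""
    idx = [neuron(c, s) for c, s in enumerate(message)]
    for i in idx:
        for j in idx:
            if i != j:
                W[i, j] = True

def recall(partial):
    """Recover erased symbols (None) by scoring each candidate neuron
    against the known ones and keeping one winner per cluster.
    A single decoding pass, for brevity; iterating it would be closer
    to a recurrent network."""
    known = [neuron(c, s) for c, s in enumerate(partial) if s is not None]
    out = list(partial)
    for c, s in enumerate(partial):
        if s is None:
            scores = W[c * L:(c + 1) * L, known].sum(axis=1)
            out[c] = int(scores.argmax())
    return out

msg = [17, 42, 3, 99, 200, 7, 64, 128]
learn(msg)
print(recall([17, None, 3, None, 200, 7, 64, 128]))  # recovers msg
```

In this toy setting, recall succeeds despite two erased symbols because the stored clique gives the correct neurons in the erased clusters the maximal number of connections to the known ones.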