
Information and topology in attractor neural networks.

Author information

Dominguez D, Koroutchev K, Serrano E, Rodríguez F B

Publication information

Neural Comput. 2007 Apr;19(4):956-73. doi: 10.1162/neco.2007.19.4.956.

Abstract

A wide range of networks, including those with small-world topology, can be modeled by the connectivity ratio and randomness of the links. Both learning and attractor abilities of a neural network can be measured by the mutual information (MI) as a function of the load and the overlap between patterns and retrieval states. In this letter, we use MI to search for the optimal topology with regard to the storage and attractor properties of the network in an Amari-Hopfield model. We find that while an optimal storage implies an extremely diluted topology, a large basin of attraction leads to moderate levels of connectivity. This optimal topology is related to the clustering and path length of the network. We also build a diagram for the dynamical phases with random or local initial overlap and show that very diluted networks lose their attractor ability.
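The setup described in the abstract can be sketched in code. This is a minimal illustration, not the authors' implementation: a Watts-Strogatz-style rewiring probability `omega` stands in for the paper's link randomness, the Hebb rule is restricted to the surviving links (dilution), and the per-neuron mutual information is computed under the assumption of unbiased binary patterns, where an overlap `m` corresponds to a bit-flip probability of `(1 - m) / 2`.

```python
import numpy as np

rng = np.random.default_rng(0)

def small_world_links(N, K, omega):
    """Ring of N neurons, each linked to its K nearest neighbours; every
    link is rewired to a random target with probability omega
    (omega = 0: purely local, omega = 1: fully random)."""
    adj = np.zeros((N, N), dtype=bool)
    for i in range(N):
        for d in range(1, K // 2 + 1):
            j = (i + d) % N
            if rng.random() < omega:
                j = int(rng.integers(N))
                while j == i or adj[i, j]:
                    j = int(rng.integers(N))
            adj[i, j] = adj[j, i] = True
    return adj

def hebbian_weights(patterns, adj):
    """Hebb rule, kept only on the existing links (diluted network)."""
    W = patterns.T @ patterns / patterns.shape[1]
    np.fill_diagonal(W, 0.0)
    return W * adj

def retrieve(W, state, sweeps=10):
    """Asynchronous zero-threshold dynamics (converges for symmetric W)."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

def overlap(pattern, state):
    return float(pattern @ state) / len(pattern)

def mutual_info(m):
    """MI per neuron in bits for unbiased +/-1 patterns at overlap m."""
    eps = (1.0 - abs(m)) / 2.0
    if eps <= 0.0:
        return 1.0
    return 1.0 + eps * np.log2(eps) + (1 - eps) * np.log2(1 - eps)

# Store P patterns in a diluted small-world net and cue with a noisy pattern.
N, K, P = 200, 20, 2
adj = small_world_links(N, K, omega=0.3)
patterns = rng.choice([-1, 1], size=(P, N))
W = hebbian_weights(patterns, adj)

cue = patterns[0].copy()
cue[rng.choice(N, size=N // 10, replace=False)] *= -1  # flip 10% of bits
m = overlap(patterns[0], retrieve(W, cue))
```

At this low load (P/K = 0.1) the network sits inside its retrieval phase, so the final overlap should be close to 1 and the per-neuron MI close to its 1-bit ceiling; sweeping `omega` and the connectivity ratio `K/N` while tracking `mutual_info(m)` is the kind of scan the letter uses to locate the optimal topology.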

