Gosti Giorgio, Folli Viola, Leonetti Marco, Ruocco Giancarlo
Center for Life Nanoscience, Istituto Italiano di Tecnologia, Viale Regina Elena 291, 00161 Rome, Italy.
CNR NANOTEC-Institute of Nanotechnology c/o Campus Ecotekne, University of Salento, Via Monteroni, 73100 Lecce, Italy.
Entropy (Basel). 2019 Jul 25;21(8):726. doi: 10.3390/e21080726.
In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost never allowed, in either artificial or biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (M) is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well beyond the 0.14N threshold, for M much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction of retrieval errors allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, given a network with autapses in this high-storage regime, the basin of attraction of the stored memories shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single-bit error in the initial pattern would lead the system to a stationary state associated with a different memory. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing-neighborhood of states surrounding each stored memory. An absorbing-neighborhood is a set, defined by a Hamming distance around a network state, which is absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set. We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing-neighborhood of exponentially growing size.
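Since the abstract turns on keeping self-couplings (autapses) in the Hebbian weight matrix, a minimal toy sketch may help fix ideas. This is illustrative only, not the authors' implementation: the sizes N = 100 and M = 30, the synchronous sign-update rule, and the retrieve helper are all assumptions made here for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 100, 30                        # M = 30 exceeds the classical 0.14 * N = 14 bound
patterns = rng.choice([-1, 1], size=(M, N))

# Hebbian learning rule; the diagonal is deliberately NOT zeroed,
# so each neuron keeps a self-coupling (an autapse).
W = patterns.T @ patterns / N

def retrieve(state, steps=50):
    """Synchronous sign-updates until a fixed point or the step limit."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1             # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Probe at Hamming distance 1 from a stored memory.
probe = patterns[0].copy()
probe[0] *= -1
final = retrieve(probe)
print("Hamming distance to memory 0:", int(np.sum(final != patterns[0])))
```

Flipping one bit of a stored pattern and checking where the dynamics land probes exactly the basin-of-attraction issue the abstract describes: in the high-storage regime, a Hamming-distance-1 probe may be absorbed by a stationary state other than the original memory, which is the limitation the paper's absorbing-neighborhood construction is designed to overcome.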