IEEE Trans Neural Netw Learn Syst. 2014 Mar;25(3):557-70. doi: 10.1109/TNNLS.2013.2277608.
We consider the problem of neural association for a network of nonbinary neurons. Here, the task is to first memorize a set of patterns using a network of neurons whose states assume values from a finite number of integer levels. Later, the same network should be able to recall the previously memorized patterns from their noisy versions. Prior work in this area considers storing a finite number of purely random patterns, and has shown that the pattern retrieval capacity (the maximum number of patterns that can be memorized) scales only linearly with the number of neurons in the network. In our formulation of the problem, we concentrate on exploiting the redundancy and internal structure of the patterns to improve the pattern retrieval capacity. Our first result shows that if the given patterns have a suitable linear-algebraic structure, i.e., form a subspace of the set of all possible patterns, then the pattern retrieval capacity is exponential in the number of neurons. The second result extends this finding to cases where the patterns have weak minor components, i.e., the smallest eigenvalues of the correlation matrix tend toward zero. We use these minor components (or the basis vectors of the pattern null space) to increase both the pattern retrieval capacity and the error correction capabilities. An iterative algorithm is proposed for the learning phase, and two simple algorithms are presented for the recall phase. Using analytical methods and simulations, we show that the proposed methods can tolerate a fair number of errors in the input while memorizing an exponentially large number of patterns.
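The core idea above can be illustrated with a minimal sketch. Assume (as a toy stand-in for the paper's construction) that the stored integer-valued patterns lie in a k-dimensional subspace of R^n; then n - k independent null-space vectors act as linear constraints, and a nonzero "syndrome" flags a corrupted neuron. The brute-force single-error correction below is a hypothetical simplification of the paper's iterative recall algorithms, and the SVD stands in for its iterative learning phase:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: patterns of length n lie in a k-dimensional subspace,
# leaving m = n - k independent linear constraints (null-space vectors).
n, k = 8, 5
G = rng.integers(-2, 3, size=(k, n)).astype(float)  # subspace basis (rows)

# Null-space basis W: each row w satisfies w @ x = 0 for every stored
# pattern. The paper learns such vectors iteratively; here we take them
# directly from the SVD for illustration.
_, _, Vt = np.linalg.svd(G)
W = Vt[k:]                                          # shape (n - k, n)

# A stored pattern and a noisy copy with one corrupted neuron.
x = rng.integers(-2, 3, size=k).astype(float) @ G   # pattern in the subspace
x_noisy = x.copy()
x_noisy[3] += 2.0                                   # single additive error

# Recall: the syndrome W @ x is (numerically) zero iff all constraints hold.
assert np.allclose(W @ x, 0)
syndrome = W @ x_noisy                              # nonzero: error detected

# Brute-force single-error correction (stand-in for iterative recall):
# find the position/magnitude whose removal best zeroes the syndrome.
i, e = min(((i, e) for i in range(n) for e in (-2.0, -1.0, 1.0, 2.0)),
           key=lambda ie: np.linalg.norm(syndrome - ie[1] * W[:, ie[0]]))
x_hat = x_noisy.copy()
x_hat[i] -= e                                       # recovered pattern
```

Because every stored pattern satisfies the same n - k constraints, the capacity is bounded by the number of integer points in the subspace rather than by the number of neurons, which is the intuition behind the exponential scaling claimed in the abstract.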