Biologically Inspired Neural and Dynamical Systems (BINDS) Lab, Department of Computer Science, University of Massachusetts Amherst, Amherst, Massachusetts, USA.
PLoS One. 2010 Jun 11;5(6):e10955. doi: 10.1371/journal.pone.0010955.
This paper introduces a new model of associative memory capable of storing both binary and continuous-valued inputs. Based on kernel theory, the memory model is, on the one hand, a generalization of Radial Basis Function networks and, on the other, analogous in feature space to a Hopfield network. Attractors can be added, deleted, and updated on-line simply, without harming existing memories, and the number of attractors is independent of input dimension. Input vectors do not have to adhere to a fixed or bounded dimensionality; it can increase or decrease without relearning previous memories. A memory consolidation process enables the network to generalize concepts and form clusters of input data, outperforming many unsupervised clustering techniques; this process is demonstrated on handwritten digits from MNIST. Another process, reminiscent of memory reconsolidation, is introduced, in which existing memories are refreshed and tuned with new inputs; this process is demonstrated on a series of morphed faces.
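To make the idea concrete, the sketch below illustrates one plausible reading of such a kernel associative memory: stored patterns act as attractors, and recall iterates a kernel-weighted average of the stored patterns until the state settles. This is a minimal illustrative sketch, not the paper's actual formulation; the Gaussian kernel, the update rule, and all names (KernelMemory, add, delete, recall, sigma) are assumptions chosen for the example.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=0.5):
    """Gaussian (RBF) kernel between two vectors (assumed kernel choice)."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

class KernelMemory:
    """Hypothetical sketch of a kernel associative memory:
    stored patterns are attractors of a kernel-weighted update."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma
        self.patterns = []  # attractor count is independent of input dimension

    def add(self, x):
        """Add a new attractor on-line; existing memories are untouched."""
        self.patterns.append(np.asarray(x, dtype=float))

    def delete(self, i):
        """Remove attractor i without relearning the others."""
        del self.patterns[i]

    def recall(self, x, steps=50):
        """Iterate toward the nearest attractor: each step moves the
        state to the kernel-weighted average of stored patterns."""
        x = np.asarray(x, dtype=float)
        for _ in range(steps):
            w = np.array([gaussian_kernel(x, p, self.sigma)
                          for p in self.patterns])
            x = (w @ np.vstack(self.patterns)) / w.sum()
        return x

# Usage: store two patterns, then recall from a noisy cue.
mem = KernelMemory(sigma=0.3)
mem.add([1.0, 0.0, 1.0])
mem.add([0.0, 1.0, 0.0])
print(mem.recall([0.9, 0.1, 0.8]))  # settles near the first stored pattern
```

With a sufficiently small kernel width, the kernel-weighted average is dominated by the nearest stored pattern, so noisy cues are pulled onto the corresponding attractor, mirroring the on-line add/delete behavior described in the abstract.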