Carpenter Gail A., Milenova Boriana L., Noeske Benjamin W.
Center for Adaptive Systems and Department of Cognitive and Neural Systems, Boston University, Boston, USA
Neural Netw. 1998 Jul;11(5):793-813. doi: 10.1016/s0893-6080(98)00019-7.
Distributed coding at the hidden layer of a multi-layer perceptron (MLP) endows the network with memory compression and noise tolerance capabilities. However, an MLP typically requires slow off-line learning to avoid catastrophic forgetting in an open input environment. An adaptive resonance theory (ART) model is designed to guarantee stable memories even with fast on-line learning. However, ART stability typically requires winner-take-all coding, which may cause category proliferation in a noisy input environment. Distributed ARTMAP (dARTMAP) seeks to combine the computational advantages of MLP and ART systems in a real-time neural network for supervised learning. An implementation algorithm presented here specifies one class of dARTMAP networks. This system incorporates elements of the unsupervised dART model, as well as new features, including a content-addressable memory (CAM) rule for improved contrast control at the coding field. A dARTMAP system reduces to fuzzy ARTMAP when coding is winner-take-all. Simulations show that dARTMAP retains fuzzy ARTMAP accuracy while significantly improving memory compression.
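The contrast-control idea can be sketched with the standard fuzzy ART choice function together with a power-rule normalization. The power rule below is only an illustrative stand-in for the paper's CAM rule, and the input and weight values are hypothetical; the point is that a single sharpness parameter interpolates between a fully distributed code and an approximately winner-take-all code.

```python
def choice(x, w, alpha=0.01):
    # Fuzzy ART choice function: T_j = |x ^ w_j| / (alpha + |w_j|),
    # where ^ is the component-wise minimum and |.| the L1 norm.
    return [sum(min(a, b) for a, b in zip(x, wj)) / (alpha + sum(wj))
            for wj in w]

def cam_activation(T, p=1.0):
    # Power-rule contrast control (an illustrative stand-in for the
    # dARTMAP CAM rule): y_j = T_j**p / sum_k T_k**p.
    # p = 1 leaves the code fully distributed; as p grows, the code
    # sharpens toward winner-take-all.
    Tp = [t ** p for t in T]
    s = sum(Tp)
    return [t / s for t in Tp]

# Complement-coded 2-D input and three category weights (hypothetical values).
x = [0.7, 0.2, 0.3, 0.8]
w = [[0.6, 0.1, 0.3, 0.7],
     [0.2, 0.2, 0.5, 0.5],
     [0.9, 0.0, 0.1, 0.9]]

T = choice(x, w)
y_dist = cam_activation(T, p=2.0)    # distributed coding field activation
y_wta = cam_activation(T, p=200.0)   # sharply peaked: near winner-take-all
```

In the winner-take-all limit the normalized activation concentrates on the category with the largest choice value, which is the regime in which dARTMAP reduces to fuzzy ARTMAP.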