Alkon D L, Blackwell K T, Barbour G S, Rigler A K, Vogl T P
Laboratory for Cellular and Molecular Neurobiology, National Institute of Neurological Disorders and Stroke, National Institutes of Health, Bethesda, MD 20892.
Biol Cybern. 1990;62(5):363-76. doi: 10.1007/BF00197642.
A novel artificial neural network, derived from neurobiological observations, is described and examples of its performance are presented. This DYnamically STable Associative Learning (DYSTAL) network associatively learns both correlations and anticorrelations, and can be configured to classify or restore patterns with only a change in the number of output units. DYSTAL exhibits some particularly desirable properties: computational effort scales linearly with the number of connections, i.e., it is O(N) in complexity; performance of the network is stable with respect to network parameters over wide ranges of their values and over the size of the input field; storage of a very large number of patterns is possible; patterns need not be orthogonal; network connections are not restricted to multi-layer feed-forward or any other specific structure; and, for a known set of deterministic input patterns, the network weights can be computed, a priori, in closed form. The network has been associatively trained to perform the XOR function as well as other classification tasks. The network has also been trained to restore patterns obscured by binary or analog noise. Neither global nor local feedback connections are required during learning; hence the network is particularly suitable for hardware (VLSI) implementation.
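The DYSTAL algorithm itself is not specified in this abstract. As a purely illustrative sketch of the general ideas it mentions — closed-form, correlation-based (Hebbian) weight computation for a known pattern set, and restoration of patterns corrupted by binary noise — here is a minimal Hopfield-style hetero-associative memory. All names (`train`, `restore`) and the outer-product learning rule are assumptions for illustration, not the authors' method; note in particular that Hopfield recall uses feedback at retrieval time, which DYSTAL explicitly avoids.

```python
# Generic Hebbian associative-memory sketch -- NOT the DYSTAL algorithm.
# Illustrates closed-form, correlation-based weight computation and
# restoration of a bipolar pattern corrupted by a flipped bit.
import numpy as np

def train(patterns):
    """Store bipolar (+1/-1) patterns via outer-product (Hebbian) learning.
    For a known, deterministic pattern set the weights are computed in
    closed form; cost scales linearly with the number of connections."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)          # correlation of each unit pair
    np.fill_diagonal(W, 0.0)         # no self-connections
    return W / len(patterns)

def restore(W, probe):
    """One synchronous threshold update: recall the stored pattern
    most correlated with the (possibly noisy) probe."""
    return np.sign(W @ probe)

# Two orthogonal 8-unit patterns.
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1,  1, 1,  1, -1, -1, -1, -1]])
W = train(patterns)

# Flip one bit of the first pattern ("binary noise") and restore it.
noisy = patterns[0].copy()
noisy[2] *= -1
print(restore(W, noisy))  # recovers the clean first pattern
```

The closed-form training step mirrors the abstract's claim that, for known deterministic inputs, weights can be computed a priori; the single-pass restoration step is only a stand-in for DYSTAL's feedback-free recall.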