Chen C M, Yang J F
Dept. of Inf. Manage. Technol., Tamsui Oxford Univ. Coll., Tainan.
IEEE Trans Syst Man Cybern B Cybern. 2000;30(1):25-30. doi: 10.1109/3477.826944.
In this paper, we propose generalized layer winner-take-all (WTA) neural networks based on the suggested full WTA networks, which can be extended from any existing WTA structure with a simple weighted-and-sum neuron. With modular regularity and local connections, the layer WTA network in either hierarchical or recursive structure is suitable for a large number of competitors. The complexity and convergence performance of layer and direct WTA neural networks are analyzed. Simulation results and theoretical analyses verify that the layer WTA neural networks, with their extendibility, outperform the original direct WTA structures in terms of lower complexity and faster convergence.
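The hierarchical idea above can be illustrated with a minimal sketch: competitors are partitioned into small local groups, each group elects a local winner, and the local winners then compete in a second layer. The group size and the plain argmax competition below are illustrative assumptions, not the paper's actual network dynamics or neuron model.

```python
# Hypothetical two-level (layer) WTA sketch. Assumptions: competition is
# modeled as a simple argmax rather than the paper's iterative network
# dynamics; group_size=4 is an arbitrary illustrative choice.

def direct_wta(values):
    """Direct WTA: all competitors compete at once; returns the winner's index."""
    winner = 0
    for i, v in enumerate(values):
        if v > values[winner]:
            winner = i
    return winner

def layer_wta(values, group_size=4):
    """Two-layer WTA: local competitions first, then a competition among
    the local winners (modular regularity, local connections)."""
    # Stage 1: find the winner within each local group.
    local_winners = []
    for start in range(0, len(values), group_size):
        group = values[start:start + group_size]
        local_winners.append(start + direct_wta(group))
    # Stage 2: local winners compete to determine the global winner.
    second_stage = direct_wta([values[i] for i in local_winners])
    return local_winners[second_stage]

if __name__ == "__main__":
    inputs = [0.2, 0.9, 0.1, 0.4, 0.7, 0.95, 0.3, 0.6]
    # Both structures select the same global winner (index of the maximum).
    print(direct_wta(inputs), layer_wta(inputs))
```

Because each stage involves far fewer competitors than the full set, a layered structure keeps each competition small and locally connected, which is the source of the complexity and convergence advantage the abstract describes.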