Gallant S I
Coll. of Comput. Sci., Northeastern Univ., Boston, MA.
IEEE Trans Neural Netw. 1990;1(2):179-91. doi: 10.1109/72.80230.
A key task for connectionist research is the development and analysis of learning algorithms. An examination is made of several supervised learning algorithms for single-cell and network models. The heart of these algorithms is the pocket algorithm, a modification of perceptron learning that makes perceptron learning well-behaved with nonseparable training data, even if the data are noisy and contradictory. Features of these algorithms include speed, i.e. algorithms fast enough to handle large sets of training data; network scaling properties, i.e. network methods scale up almost as well as single-cell models when the number of inputs is increased; analytic tractability, i.e. upper bounds on classification error are derivable; online learning, i.e. some variants can learn continually, without referring to previous data; and winner-take-all groups or choice groups, i.e. algorithms can be adapted to select one out of a number of possible classifications. These learning algorithms are suitable for applications in machine learning, pattern recognition, and connectionist expert systems.
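The pocket algorithm described in the abstract can be sketched as follows: ordinary perceptron updates are made on randomly drawn training examples, but a separate "pocket" copy of the weights is kept, replaced only when the current weights achieve a longer run of consecutive correct classifications than any seen so far. This is a minimal illustrative sketch, not the paper's exact formulation; the function name, parameters, and stopping rule are assumptions.

```python
import numpy as np

def pocket_train(X, y, iterations=200, seed=0):
    """Sketch of pocket learning for labels y in {-1, +1}.

    Runs perceptron updates on randomly chosen examples and keeps
    ("pockets") the weight vector with the longest run of consecutive
    correct classifications observed so far.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)        # current perceptron weights
    pocket_w = w.copy()    # best weights found so far
    run = 0                # current streak of correct classifications
    best_run = 0
    for _ in range(iterations):
        i = rng.integers(n)
        if np.sign(X[i] @ w) == y[i]:
            run += 1
            if run > best_run:        # new longest streak: update pocket
                best_run = run
                pocket_w = w.copy()
        else:
            run = 0
            w = w + y[i] * X[i]       # standard perceptron correction
    return pocket_w
```

On separable data this reduces to ordinary perceptron learning; on nonseparable, noisy, or contradictory data the pocketed weights remain a good classifier even though the current weights keep oscillating, which is the well-behavedness property the abstract highlights.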