
Context-dependent neural nets--structures and learning.

Author Information

Ciskowski Piotr, Rafajłowicz Ewaryst

Affiliations

Wroclaw University of Technology, 50-370 Wrocław, Poland.

Publication Information

IEEE Trans Neural Netw. 2004 Nov;15(6):1367-77. doi: 10.1109/TNN.2004.837839.

Abstract

A novel approach to neural network modeling is presented in this paper. It is unique in that it allows the network's weights to change in response to certain environmental factors even after the learning process is complete. Models of a context-dependent (cd) neuron and of one- and multilayer feedforward nets are presented, together with basic learning algorithms and examples of their operation. The Vapnik-Chervonenkis (VC) dimension of a cd neuron is derived, as well as the VC dimension of multilayer feedforward nets. The properties of cd nets are discussed and compared with those of traditional nets. Possible applications to classification and control problems are also outlined, and an example is presented.
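The abstract's central idea, weights that remain functions of external context variables after training, can be illustrated with a short sketch. The Python snippet below is a minimal illustration only, assuming each weight is a linear combination of basis functions of a context vector z and a tanh activation; the parameterization, the basis function, and the CDNeuron class are hypothetical and are not taken from the paper.

```python
import numpy as np

# Minimal illustrative sketch of a context-dependent (cd) neuron.
# Assumption (not specified in the abstract): each weight is a linear
# combination of basis functions of a context vector z, so the effective
# weights w(z) keep adapting to the context even after training.

def basis(z):
    """Hypothetical context basis: a constant term plus the raw context values."""
    return np.concatenate(([1.0], np.asarray(z, dtype=float)))

class CDNeuron:
    def __init__(self, n_inputs, n_context, rng=None):
        rng = np.random.default_rng(rng)
        # Coefficient matrix A: one row per input weight (plus bias),
        # one column per context basis function. Training would adjust A.
        self.A = rng.normal(scale=0.1, size=(n_inputs + 1, n_context + 1))

    def weights(self, z):
        # Effective, context-dependent weights: w(z) = A @ phi(z)
        return self.A @ basis(z)

    def forward(self, x, z):
        x_aug = np.concatenate(([1.0], np.asarray(x, dtype=float)))  # bias input
        return np.tanh(self.weights(z) @ x_aug)

# Usage: the same input x yields different outputs under different contexts z.
neuron = CDNeuron(n_inputs=2, n_context=1, rng=0)
x = [0.5, -1.0]
print(neuron.forward(x, z=[0.0]))
print(neuron.forward(x, z=[2.0]))
```

Under this assumed parameterization, learning fixes only the coefficient matrix A, while the effective weights w(z) continue to track the context vector after training, which is the behavior the abstract attributes to cd nets.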

