Universal approximation to nonlinear operators by neural networks with arbitrary activation functions and its application to dynamical systems.

Authors

Chen T, Chen H

Affiliation

Dept. of Math., Fudan Univ., Shanghai.

Publication

IEEE Trans Neural Netw. 1995;6(4):911-7. doi: 10.1109/72.392253.

Abstract

The purpose of this paper is to investigate neural network capability systematically. The main results are: 1) every Tauber-Wiener function qualifies as an activation function for the hidden layer of a three-layered neural network; 2) a continuous function in S'(R^1) is a Tauber-Wiener function if and only if it is not a polynomial; 3) neural networks can approximate nonlinear functionals defined on compact subsets of a Banach space, as well as nonlinear operators; and 4) neural computation can approximate the output of a dynamical system as a whole (not merely at a fixed point), and thus identify the system.
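
To make result 3) concrete, the sketch below illustrates in Python the two-factor network form in which the operator-approximation result is usually stated: the input function u is sampled at finitely many fixed points x_1, ..., x_m, one group of hidden units acts on those samples, another acts on the evaluation point y, and the two groups are combined multiplicatively. This is my own illustration, not code from the paper; the parameter values, array shapes, and the choice of tanh as the activation are assumptions. The theorem only asserts that suitable constants exist for any prescribed accuracy; it does not prescribe a way to find them.

```python
import numpy as np

# Minimal sketch (illustrative, not the paper's construction) of the
# two-factor approximation of a nonlinear operator G:
#   G(u)(y) ~ sum_k sum_i c[i,k] * sigma(sum_j xi[i,j,k]*u(x_j) + theta[i,k])
#                         * sigma(omega[k] . y + zeta[k])
# where u is known only through its values at m fixed sensor points.

def sigma(t):
    # Any Tauber-Wiener activation is admissible; tanh is one example,
    # since it is continuous and not a polynomial.
    return np.tanh(t)

def operator_net(u_samples, y, c, xi, theta, omega, zeta):
    """Evaluate the approximation of G(u)(y).

    u_samples : (m,)      values u(x_1), ..., u(x_m) of the input function
    y         : (d,)      evaluation point in the output domain
    c         : (n, p)    outer coefficients
    xi        : (n, m, p) weights acting on the sampled input function
    theta     : (n, p)    biases for the function factor
    omega     : (p, d)    weights acting on y
    zeta      : (p,)      biases for the y factor
    """
    n, p = c.shape
    total = 0.0
    for k in range(p):
        branch = sigma(xi[:, :, k] @ u_samples + theta[:, k])  # shape (n,)
        trunk = sigma(omega[k] @ y + zeta[k])                  # scalar
        total += np.dot(c[:, k], branch) * trunk
    return total

# Example with arbitrary (untrained) placeholder parameters, shown only
# to make the shapes explicit; the printed number is not meaningful.
rng = np.random.default_rng(0)
m, d, n, p = 10, 1, 8, 6
x_sensors = np.linspace(0.0, 1.0, m)
u_samples = np.sin(2 * np.pi * x_sensors)   # u(x) = sin(2*pi*x), sampled
y = np.array([0.3])
params = (rng.normal(size=(n, p)), rng.normal(size=(n, m, p)),
          rng.normal(size=(n, p)), rng.normal(size=(p, d)),
          rng.normal(size=(p,)))
print(operator_net(u_samples, y, *params))
```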
