Suppr 超能文献


The Chebyshev-polynomials-based unified model neural networks for function approximation.

Authors

Lee T T, Jeng J T

Affiliations

Dept. of Electr. Eng., Nat. Taiwan Inst. of Technol., Taipei.

Publication

IEEE Trans Syst Man Cybern B Cybern. 1998;28(6):925-35. doi: 10.1109/3477.735405.

DOI: 10.1109/3477.735405
PMID: 18256014
Abstract

In this paper, we propose the approximate transformable technique, which includes direct and indirect transformations, to obtain Chebyshev-Polynomials-Based (CPB) unified model neural networks for feedforward/recurrent neural networks via Chebyshev polynomial approximation. Based on this technique, we derive the relationship between single-layer neural networks and multilayer perceptron neural networks. It is shown that the CPB unified model neural networks can be represented as functional link networks based on Chebyshev polynomials, and that these networks use the recursive least squares method with a forgetting factor as the learning algorithm. It turns out that the CPB unified model neural networks not only retain the capability of a universal approximator but also learn faster than conventional feedforward/recurrent neural networks. Furthermore, we derive the condition under which the unified model generated by Chebyshev polynomials is optimal in the least-squares error sense in the single-variable case. Computer simulations show that the proposed method does have the capability of a universal approximator on several function approximation tasks, with a considerable reduction in learning time.

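The learning scheme the abstract describes, a functional link network whose features are Chebyshev polynomials of the input, with weights trained by recursive least squares (RLS) with a forgetting factor, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the class name `ChebyshevRLS`, the polynomial order, the forgetting factor, and the initialization constant `delta` are all assumptions.

```python
import numpy as np

def chebyshev_features(x, order):
    """Chebyshev basis T_0..T_order via the recurrence
    T_{k+1}(x) = 2x T_k(x) - T_{k-1}(x), for x in [-1, 1]."""
    x = np.asarray(x, dtype=float)
    T = [np.ones_like(x), x]
    for _ in range(2, order + 1):
        T.append(2.0 * x * T[-1] - T[-2])
    return np.stack(T[: order + 1], axis=-1)

class ChebyshevRLS:
    """Single linear layer over Chebyshev features, trained by
    recursive least squares with forgetting factor `forgetting`."""

    def __init__(self, order, forgetting=0.99, delta=1e3):
        self.order = order
        self.lam = forgetting
        n = order + 1
        self.w = np.zeros(n)           # layer weights
        self.P = delta * np.eye(n)     # inverse correlation matrix

    def update(self, x, d):
        phi = chebyshev_features(x, self.order)
        # RLS gain, a-priori error, weight and P updates
        k = self.P @ phi / (self.lam + phi @ self.P @ phi)
        err = d - phi @ self.w
        self.w = self.w + k * err
        self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
        return err

    def predict(self, x):
        return chebyshev_features(x, self.order) @ self.w

# Fit f(x) = sin(pi * x) on [-1, 1] from streaming samples
rng = np.random.default_rng(0)
model = ChebyshevRLS(order=8, forgetting=1.0)
for x in rng.uniform(-1.0, 1.0, size=2000):
    model.update(x, np.sin(np.pi * x))
```

Because the model is linear in its (fixed, nonlinear) Chebyshev features, RLS converges to the least-squares weights without backpropagation, which is the source of the faster learning the abstract claims relative to gradient-trained feedforward/recurrent networks.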

Similar Articles

1
The Chebyshev-polynomials-based unified model neural networks for function approximation.
IEEE Trans Syst Man Cybern B Cybern. 1998;28(6):925-35. doi: 10.1109/3477.735405.
2
Control of magnetic bearing systems via the Chebyshev polynomial-based unified model (CPBUM) neural network.
IEEE Trans Syst Man Cybern B Cybern. 2000;30(1):85-92. doi: 10.1109/3477.826949.
3
Nonlinear dynamic system identification using Chebyshev functional link artificial neural networks.
IEEE Trans Syst Man Cybern B Cybern. 2002;32(4):505-11. doi: 10.1109/TSMCB.2002.1018769.
4
A learning rule for very simple universal approximators consisting of a single layer of perceptrons.
Neural Netw. 2008 Jun;21(5):786-95. doi: 10.1016/j.neunet.2007.12.036. Epub 2007 Dec 31.
5
Some comparisons of complexity in dictionary-based and linear computational models.
Neural Netw. 2011 Mar;24(2):171-82. doi: 10.1016/j.neunet.2010.10.002. Epub 2010 Nov 19.
6
Local coupled feedforward neural network.
Neural Netw. 2010 Jan;23(1):108-13. doi: 10.1016/j.neunet.2009.06.016. Epub 2009 Jun 30.
7
Specification of training sets and the number of hidden neurons for multilayer perceptrons.
Neural Comput. 2001 Dec;13(12):2673-80. doi: 10.1162/089976601317098484.
8
Channel selection and classification of electroencephalogram signals: an artificial neural network and genetic algorithm-based approach.
Artif Intell Med. 2012 Jun;55(2):117-26. doi: 10.1016/j.artmed.2012.02.001. Epub 2012 Apr 12.
9
Pipelined Chebyshev functional link artificial recurrent neural network for nonlinear adaptive filter.
IEEE Trans Syst Man Cybern B Cybern. 2010 Feb;40(1):162-72. doi: 10.1109/TSMCB.2009.2024313. Epub 2009 Sep 11.
10
A learning algorithm for adaptive canonical correlation analysis of several data sets.
Neural Netw. 2007 Jan;20(1):139-52. doi: 10.1016/j.neunet.2006.09.011. Epub 2006 Nov 17.

Cited By

1
Neural network backstepping control of OWC wave energy system.
Sci Rep. 2025 Mar 7;15(1):7983. doi: 10.1038/s41598-025-87725-x.
2
Preserving differential privacy in convolutional deep belief networks.
Mach Learn. 2017 Oct;106(9-10):1681-1704. doi: 10.1007/s10994-017-5656-2. Epub 2017 Jul 13.