Suppr 超能文献



Nonlinear Identification Using Orthogonal Forward Regression With Nested Optimal Regularization.

Publication Information

IEEE Trans Cybern. 2015 Dec;45(12):2925-36. doi: 10.1109/TCYB.2015.2389524. Epub 2015 Jan 27.

DOI: 10.1109/TCYB.2015.2389524
PMID: 25643422
Abstract

An efficient data-based modeling algorithm for nonlinear system identification is introduced for radial basis function (RBF) neural networks, with the aim of maximizing generalization capability based on the concept of leave-one-out (LOO) cross validation. Each RBF kernel has its own kernel width parameter, and the basic idea is to optimize the multiple pairs of regularization parameters and kernel widths, each pair associated with one kernel, one at a time within the orthogonal forward regression (OFR) procedure. Thus, each OFR step consists of one model term selection based on the LOO mean square error (LOOMSE), followed by the optimization of the associated kernel width and regularization parameter, also based on the LOOMSE. Since the same LOOMSE is adopted for model selection, as in our previous state-of-the-art local regularization assisted orthogonal least squares (LROLS) algorithm, the proposed new OFR algorithm is likewise capable of producing a very sparse RBF model with excellent generalization performance. Unlike the LROLS algorithm, which requires an additional iterative loop to optimize the regularization parameters as well as an additional procedure to optimize the kernel width, the proposed new OFR algorithm optimizes both the kernel widths and regularization parameters within a single OFR procedure, and the required computational complexity is consequently reduced dramatically. Nonlinear system identification examples demonstrate the effectiveness of this new approach in comparison with the well-known support vector machine and least absolute shrinkage and selection operator approaches, as well as the LROLS algorithm.
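The selection loop the abstract describes — pick one model term at a time by minimizing the LOO mean square error — can be sketched as below. This is a simplified skeleton only: it scores each candidate ridge model with an exact hat-matrix LOO formula rather than the paper's efficient recursive LOOMSE update, and it omits the nested per-kernel width/regularization optimization. All function names and parameters here are illustrative, not from the paper.

```python
import numpy as np

def loo_mse(Phi, y, lam):
    """Exact leave-one-out MSE of a ridge fit, via the hat matrix:
    e_loo[i] = (y[i] - yhat[i]) / (1 - H[i, i])."""
    H = Phi @ np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T)
    resid = y - H @ y
    return np.mean((resid / (1.0 - np.diag(H))) ** 2)

def ofr_loomse(candidates, y, lam=1e-3, max_terms=10):
    """Greedy forward selection of regressor columns (e.g. RBF kernel
    columns), one term per step, scored by LOO MSE; stops when the
    LOO MSE no longer improves."""
    selected = []
    remaining = list(range(candidates.shape[1]))
    best_score = np.inf
    while remaining and len(selected) < max_terms:
        # Score every remaining candidate when added to the current model.
        score, j = min(
            (loo_mse(candidates[:, selected + [j]], y, lam), j)
            for j in remaining
        )
        if score >= best_score:  # no further generalization gain: stop
            break
        best_score = score
        selected.append(j)
        remaining.remove(j)
    return selected, best_score
```

On data where only a few candidate columns carry signal, the loop recovers them and terminates early, which is the sparsity-by-LOO behavior the abstract claims for the full algorithm.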


Similar Articles

1. Nonlinear Identification Using Orthogonal Forward Regression With Nested Optimal Regularization.
IEEE Trans Cybern. 2015 Dec;45(12):2925-36. doi: 10.1109/TCYB.2015.2389524. Epub 2015 Jan 27.
2. Sparse kernel density construction using orthogonal forward regression with leave-one-out test score and local regularization.
IEEE Trans Syst Man Cybern B Cybern. 2004 Aug;34(4):1708-17. doi: 10.1109/tsmcb.2004.828199.
3. Sparse kernel learning with LASSO and Bayesian inference algorithm.
Neural Netw. 2010 Mar;23(2):257-64. doi: 10.1016/j.neunet.2009.07.001. Epub 2009 Jul 9.
4. Construction of tunable radial basis function networks using orthogonal forward selection.
IEEE Trans Syst Man Cybern B Cybern. 2009 Apr;39(2):457-66. doi: 10.1109/TSMCB.2008.2006688. Epub 2008 Dec 16.
5. Sparse modeling using orthogonal forward regression with PRESS statistic and regularization.
IEEE Trans Syst Man Cybern B Cybern. 2004 Apr;34(2):898-911. doi: 10.1109/tsmcb.2003.817107.
6. Probability density estimation with tunable kernels using orthogonal forward regression.
IEEE Trans Syst Man Cybern B Cybern. 2010 Aug;40(4):1101-14. doi: 10.1109/TSMCB.2009.2034732. Epub 2009 Dec 15.
7. Automatic kernel regression modelling using combined leave-one-out test score and regularised orthogonal least squares.
Int J Neural Syst. 2004 Feb;14(1):27-37. doi: 10.1142/S0129065704001875.
8. Combined genetic algorithm optimization and regularized orthogonal least squares learning for radial basis function networks.
IEEE Trans Neural Netw. 1999;10(5):1239-43. doi: 10.1109/72.788663.
9. A new discrete-continuous algorithm for radial basis function networks construction.
IEEE Trans Neural Netw Learn Syst. 2013 Nov;24(11):1785-98. doi: 10.1109/TNNLS.2013.2264292.
10. Gene selection in cancer classification using sparse logistic regression with Bayesian regularization.
Bioinformatics. 2006 Oct 1;22(19):2348-55. doi: 10.1093/bioinformatics/btl386. Epub 2006 Jul 14.