An adaptive training method for optimal interpolative neural nets.

Author Information

Liu T Z, Yen C W

Affiliation

Department of Mechanical Engineering, National Sun Yat-sen University, Kaohsiung, Taiwan.

Publication Information

Int J Neural Syst. 1997 Apr;8(2):145-54. doi: 10.1142/s0129065797000173.

DOI: 10.1142/s0129065797000173
PMID: 9327271
Abstract

In contrast to conventional multilayered feedforward networks, which are typically trained by iterative gradient search methods, an optimal interpolative (OI) net can be trained by a noniterative least squares algorithm called RLS-OI. The basic idea of RLS-OI is to use a subset of the training set, whose inputs are called subprototypes, to constrain the OI net solution. A subset of these subprototypes, called prototypes, is then chosen as the parameter vectors of the activation functions of the OI net to satisfy the subprototype constraints in the least squares (LS) sense. By dynamically increasing the numbers of subprototypes and prototypes, RLS-OI evolves the OI net from scratch to the extent sufficient to solve a given classification problem. To improve the performance of RLS-OI, this paper addresses two important problems in OI net training: the selection of the subprototypes and the selection of the prototypes. By choosing subprototypes from poorly classified regions, this paper proposes a new subprototype selection method that is adaptive to the changing classification performance of the growing OI net. This paper also proposes a new prototype selection criterion to reduce the complexity of the OI net. For the same training accuracy, simulation results demonstrate that the proposed approach produces a smaller OI net than the RLS-OI algorithm. Experimental results also show that the proposed approach is less sensitive to variations in the training set than RLS-OI.
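To make the grow-and-refit idea concrete, below is a minimal Python sketch of the adaptive selection loop the abstract describes: the output weights are solved noniteratively in the least squares sense, and each new subprototype is drawn from the poorly classified region of the training set. This is a sketch under stated assumptions, not the paper's algorithm: it stands in an RBF-style Gaussian unit for the OI net's activation functions, and the function names, the Gaussian form, and the width parameter are illustrative.

```python
import numpy as np

def gaussian_design(X, prototypes, width=1.0):
    # Activation matrix: one Gaussian unit per prototype (an RBF-style
    # stand-in for the OI net's activation functions; the paper's exact
    # parameterization may differ).
    d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def fit_output_weights(X, Y, prototypes, width=1.0):
    # Noniterative step: solve the output weights in the least squares
    # sense, analogous to RLS-OI's LS fit to the subprototype constraints.
    H = gaussian_design(X, prototypes, width)
    W, *_ = np.linalg.lstsq(H, Y, rcond=None)
    return W

def grow_net(X, Y, labels, max_units=20, width=1.0):
    # Evolve the net from scratch: start with one prototype, then on each
    # round draw the next subprototype from the poorly classified region
    # (here: the most confidently misclassified sample) and refit.
    protos = X[:1].copy()
    for _ in range(max_units - 1):
        W = fit_output_weights(X, Y, protos, width)
        scores = gaussian_design(X, protos, width) @ W
        wrong = scores.argmax(axis=1) != labels
        if not wrong.any():
            break  # training set solved; stop growing
        # Margin by which the winning class beats the true class.
        margin = scores.max(axis=1) - scores[np.arange(len(X)), labels]
        worst = int(np.argmax(np.where(wrong, margin, -np.inf)))
        protos = np.vstack([protos, X[worst]])
    return protos, fit_output_weights(X, Y, protos, width)
```

With one-hot targets Y and integer class labels, this mirrors the abstract's growth loop: units are added until the training set is solved or a unit budget is reached. The paper's separate prototype selection criterion for reducing net complexity is not reproduced here.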

Similar Articles

1. An adaptive training method for optimal interpolative neural nets. Int J Neural Syst. 1997 Apr;8(2):145-54. doi: 10.1142/s0129065797000173.
2. An evolution-oriented learning algorithm for the optimal interpolative net. IEEE Trans Neural Netw. 1992;3(2):315-23. doi: 10.1109/72.125873.
3. Generalized RLS approach to the training of neural networks. IEEE Trans Neural Netw. 2006 Jan;17(1):19-34. doi: 10.1109/TNN.2005.860857.
4. Distributed fault tolerance in optimal interpolative nets. IEEE Trans Neural Netw. 2001;12(6):1348-57. doi: 10.1109/72.963771.
5. Methods for pattern selection, class-specific feature selection and classification for automated learning. Neural Netw. 2013 May;41:113-29. doi: 10.1016/j.neunet.2012.12.007. Epub 2013 Jan 11.
6. Fault-tolerant training for optimal interpolative nets. IEEE Trans Neural Netw. 1995;6(6):1531-5. doi: 10.1109/72.471356.
7. Two regularizers for recursive least squared algorithms in feedforward multilayered neural networks. IEEE Trans Neural Netw. 2001;12(6):1314-32. doi: 10.1109/72.963768.
8. An improvement of extreme learning machine for compact single-hidden-layer feedforward neural networks. Int J Neural Syst. 2008 Oct;18(5):433-41. doi: 10.1142/S0129065708001695.
9. A cascading structure and training method for multilayer neural networks. Int J Neural Syst. 1997 Oct-Dec;8(5-6):509-15. doi: 10.1142/s0129065797000495.
10. Novel maximum-margin training algorithms for supervised neural networks. IEEE Trans Neural Netw. 2010 Jun;21(6):972-84. doi: 10.1109/TNN.2010.2046423. Epub 2010 Apr 19.