Nonlinear time series analysis by neural networks: a case study.

Author Information

Saxén H

Publication Information

Int J Neural Syst. 1996 May;7(2):195-201. doi: 10.1142/s0129065796000166.

DOI: 10.1142/s0129065796000166
PMID: 8823629
Abstract

This paper presents a neural network approach to time-series analysis of a univariate nonlinear system. Feedforward networks are studied, and an appropriate network size is determined by different criteria computed on the basis of the performance of the models on the training and test sets. The analysis and conclusions drawn are supported by studies of the phase portraits of the models. A proper choice of network size is demonstrated to avoid the problems of over-parameterization. The overfitting observed for larger networks is analyzed, and the underlying reasons for their worse generalization capabilities are explained. Finally, some observations are made on the approximation provided by an oversized network with weights determined by an incomplete (interrupted) training, compared with that of the optimal-sized network.
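The model-selection procedure the abstract describes — training feedforward networks of several hidden-layer sizes on one-step-ahead prediction of a univariate nonlinear series, then comparing their training and test errors — can be sketched as follows. This is an illustrative reconstruction, not the paper's code: the logistic-map series, the candidate sizes, the learning rate, and the training schedule are all assumptions.

```python
import numpy as np

# Generate a univariate nonlinear time series (a chaotic logistic map,
# a stand-in for the unnamed system in the paper -- an assumption).
x = np.empty(600)
x[0] = 0.5
for t in range(1, 600):
    x[t] = 3.9 * x[t - 1] * (1 - x[t - 1])

# One-step-ahead prediction: input x[t], target x[t+1],
# split into training and test sets as in the abstract.
X, y = x[:-1].reshape(-1, 1), x[1:]
X_tr, y_tr = X[:400], y[:400]
X_te, y_te = X[400:], y[400:]

def train_mlp(X, y, hidden, epochs=2000, lr=0.1, seed=0):
    """Train a one-hidden-layer tanh network by full-batch gradient descent."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (1, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(0.0, 0.5, (hidden, 1)); b2 = np.zeros(1)
    n = len(y)
    for _ in range(epochs):
        h = np.tanh(X @ W1 + b1)              # hidden activations
        err = (h @ W2 + b2).ravel() - y       # prediction error
        # Backpropagation of the mean squared error
        g2 = h.T @ err[:, None] / n
        dh = (err[:, None] @ W2.T) * (1 - h ** 2)
        g1 = X.T @ dh / n
        W2 -= lr * g2; b2 -= lr * err.mean()
        W1 -= lr * g1; b1 -= lr * dh.mean(axis=0)
    return W1, b1, W2, b2

def mse(params, X, y):
    W1, b1, W2, b2 = params
    pred = (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
    return float(np.mean((pred - y) ** 2))

# Model-selection loop: train each candidate size and record
# (training MSE, test MSE); the test error guides the choice of size.
results = {}
for hidden in (1, 2, 4, 8):
    params = train_mlp(X_tr, y_tr, hidden)
    results[hidden] = (mse(params, X_tr, y_tr), mse(params, X_te, y_te))

best = min(results, key=lambda h: results[h][1])
print(best, results[best])
```

A size whose training error keeps falling while its test error rises is over-parameterized in the sense the abstract analyzes; selecting on the held-out error guards against that.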


Similar Articles

1. Nonlinear time series analysis by neural networks: a case study.
Int J Neural Syst. 1996 May;7(2):195-201. doi: 10.1142/s0129065796000166.

2. A pruning feedforward small-world neural network based on Katz centrality for nonlinear system modeling.
Neural Netw. 2020 Oct;130:269-285. doi: 10.1016/j.neunet.2020.07.017. Epub 2020 Jul 16.

3. Constructive training methods for feedforward neural networks with binary weights.
Int J Neural Syst. 1996 May;7(2):149-66. doi: 10.1142/s0129065796000129.

4. A growing and pruning sequential learning algorithm of hyper basis function neural network for function approximation.
Neural Netw. 2013 Oct;46:210-26. doi: 10.1016/j.neunet.2013.06.004. Epub 2013 Jun 14.

5. The target switch algorithm: a constructive learning procedure for feed-forward neural networks.
Neural Comput. 1995 Nov;7(6):1245-64. doi: 10.1162/neco.1995.7.6.1245.

6. Smooth function approximation using neural networks.
IEEE Trans Neural Netw. 2005 Jan;16(1):24-38. doi: 10.1109/TNN.2004.836233.

7. Hardware prototypes of a Boolean neural network and the simulated annealing optimization method.
Int J Neural Syst. 1996 Mar;7(1):45-52. doi: 10.1142/s0129065796000051.

8. Invariance priors for Bayesian feed-forward neural networks.
Neural Netw. 2006 Dec;19(10):1550-7. doi: 10.1016/j.neunet.2006.01.017. Epub 2006 Mar 31.

9. An experimental study on nonlinear function computation for neural/fuzzy hardware design.
IEEE Trans Neural Netw. 2007 Jan;18(1):266-83. doi: 10.1109/TNN.2006.884680.

10. Adaptive Neural Tracking Control for Switched High-Order Stochastic Nonlinear Systems.
IEEE Trans Cybern. 2017 Oct;47(10):3088-3099. doi: 10.1109/TCYB.2017.2684218. Epub 2017 Mar 31.