

An extended echo state network using Volterra filtering and principal component analysis.

Affiliation

DCA/FEEC/Unicamp, University of Campinas, Av. Albert Einstein, 400, 13083-852, Campinas, SP, Brazil.

Publication information

Neural Netw. 2012 Aug;32:292-302. doi: 10.1016/j.neunet.2012.02.028. Epub 2012 Feb 16.

DOI: 10.1016/j.neunet.2012.02.028
PMID: 22386782
Abstract

Echo state networks (ESNs) can be interpreted as promoting an encouraging compromise between two seemingly conflicting objectives: (i) simplicity of the resulting mathematical model and (ii) capability to express a wide range of nonlinear dynamics. By imposing fixed weights to the recurrent connections, the echo state approach avoids the well-known difficulties faced by recurrent neural network training strategies, but still preserves, to a certain extent, the potential of the underlying structure due to the existence of feedback loops within the dynamical reservoir. Moreover, the overall training process is relatively simple, as it amounts essentially to adapting the readout, which usually corresponds to a linear combiner. However, the linear nature of the output layer may limit the capability of exploring the available information, since higher-order statistics of the signals are not taken into account. In this work, we present a novel architecture for an ESN in which the linear combiner is replaced by a Volterra filter structure. Additionally, the principal component analysis technique is used to reduce the number of effective signals transmitted to the output layer. This idea not only improves the processing capability of the network, but also preserves the simplicity of the training process. The proposed architecture is then analyzed in the context of a set of representative information extraction problems, more specifically supervised and unsupervised channel equalization, and blind separation of convolutive mixtures. The obtained results, when compared to those produced by already proposed ESN versions, highlight the benefits brought by the novel network proposal and characterize it as a promising tool to deal with challenging signal processing tasks.
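The pipeline described in the abstract can be illustrated with a minimal numpy sketch (a toy under assumed settings, not the authors' exact formulation): a fixed random reservoir is driven by an input, the collected states are reduced with PCA, and a second-order Volterra-style readout (bias, linear, and quadratic cross terms of the principal components) is trained by ordinary least squares. The input, target, and all sizes below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir; scaling the spectral radius below 1 encourages
# the echo state property, so only the readout needs training.
N, T, K = 100, 500, 10
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=(N, 1))

u = np.sin(0.2 * np.arange(T))[:, None]  # toy input signal
target = u[:, 0] ** 2                    # even-order target: beyond a purely linear readout

# Drive the reservoir and collect its states
x = np.zeros(N)
X = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])
    X[t] = x

# PCA: project the states onto their top-K principal components,
# shrinking the number of effective signals fed to the readout.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:K].T

# Second-order Volterra readout: bias + linear + quadratic cross terms.
# Training remains a single linear least-squares solve.
quad = np.stack([Z[:, i] * Z[:, j] for i in range(K) for j in range(i, K)], axis=1)
Phi = np.hstack([np.ones((T, 1)), Z, quad])
w, *_ = np.linalg.lstsq(Phi, target, rcond=None)
mse = np.mean((Phi @ w - target) ** 2)
```

Note how PCA keeps the quadratic expansion tractable: the readout sees 1 + K + K(K+1)/2 features (66 here) instead of the roughly 5000 that quadratic terms over all 100 raw reservoir states would produce, while the training step is still the simple linear solve that makes ESNs attractive.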


Similar articles

1
An extended echo state network using Volterra filtering and principal component analysis.
Neural Netw. 2012 Aug;32:292-302. doi: 10.1016/j.neunet.2012.02.028. Epub 2012 Feb 16.
2
Echo state networks with filter neurons and a delay&sum readout.
Neural Netw. 2010 Mar;23(2):244-56. doi: 10.1016/j.neunet.2009.07.004. Epub 2009 Jul 16.
3
Recurrent kernel machines: computing with infinite echo state networks.
Neural Comput. 2012 Jan;24(1):104-33. doi: 10.1162/NECO_a_00200. Epub 2011 Aug 18.
4
An augmented echo state network for nonlinear adaptive filtering of complex noncircular signals.
IEEE Trans Neural Netw. 2011 Jan;22(1):74-83. doi: 10.1109/TNN.2010.2085444. Epub 2010 Nov 11.
5
Balanced echo state networks.
Neural Netw. 2012 Dec;36:35-45. doi: 10.1016/j.neunet.2012.08.008. Epub 2012 Sep 11.
6
A novel nonlinear adaptive filter using a pipelined second-order Volterra recurrent neural network.
Neural Netw. 2009 Dec;22(10):1471-83. doi: 10.1016/j.neunet.2009.05.010. Epub 2009 May 27.
7
Regularized variational Bayesian learning of echo state networks with delay&sum readout.
Neural Comput. 2012 Apr;24(4):967-95. doi: 10.1162/NECO_a_00253. Epub 2011 Dec 14.
8
Architectural and Markovian factors of echo state networks.
Neural Netw. 2011 Jun;24(5):440-56. doi: 10.1016/j.neunet.2011.02.002. Epub 2011 Feb 13.
9
MISEP method for postnonlinear blind source separation.
Neural Comput. 2007 Sep;19(9):2557-78. doi: 10.1162/neco.2007.19.9.2557.
10
Nonlinear complex-valued extensions of Hebbian learning: an essay.
Neural Comput. 2005 Apr;17(4):779-838. doi: 10.1162/0899766053429381.

Cited by

1
Fully analogue photonic reservoir computer.
Sci Rep. 2016 Mar 3;6:22381. doi: 10.1038/srep22381.