Suppr 超能文献



A novel approach to minimal reservoir computing.

Authors

Ma Haochun, Prosperino Davide, Räth Christoph

Affiliations

Department of Physics, Ludwig-Maximilians-Universität, Schellingstraße 4, 80799, Munich, Germany.

Deutsches Zentrum für Luft- und Raumfahrt (DLR), Institut für KI Sicherheit, Wilhelm-Runge-Straße 10, 89081, Ulm, Germany.

Publication

Sci Rep. 2023 Aug 10;13(1):12970. doi: 10.1038/s41598-023-39886-w.

DOI:10.1038/s41598-023-39886-w
PMID:37563235
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10415382/
Abstract

Reservoir computers are powerful machine learning algorithms for predicting nonlinear systems. Unlike traditional feedforward neural networks, they work on small training data sets, operate with linear optimization, and therefore require minimal computational resources. However, the traditional reservoir computer uses random matrices to define the underlying recurrent neural network and has a large number of hyperparameters that need to be optimized. Recent approaches show that randomness can be taken out by running regressions on a large library of linear and nonlinear combinations constructed from the input data and their time lags and polynomials thereof. However, for high-dimensional and nonlinear data, the number of these combinations explodes. Here, we show that a few simple changes to the traditional reservoir computer architecture further minimizing computational resources lead to significant and robust improvements in short- and long-term predictive performances compared to similar models while requiring minimal sizes of training data sets.
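The "traditional reservoir computer" the abstract contrasts against can be sketched as a minimal echo state network: a fixed random recurrent matrix drives the reservoir, and only a linear readout is trained by ridge regression. This is an illustrative NumPy sketch, not the paper's modified architecture; the reservoir size, spectral radius, ridge strength, and the toy logistic-map series are all assumed choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonlinear series: the logistic map (an assumed stand-in for the
# chaotic systems the paper forecasts).
T = 300
x = np.empty(T)
x[0] = 0.5
for t in range(T - 1):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

# Classical echo state network: random fixed weights, trained readout only.
N = 100                                   # reservoir size (assumed)
W_in = rng.uniform(-0.5, 0.5, size=N)     # random input weights
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius < 1

# Drive the reservoir and record its states.
states = np.zeros((T, N))
r = np.zeros(N)
for t in range(T):
    r = np.tanh(W @ r + W_in * x[t])
    states[t] = r

# Linear optimization: ridge-regress next value on the current state,
# discarding an initial washout period.
washout = 50
X = states[washout:-1]
y = x[washout + 1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)

pred = states[:-1] @ W_out
err = np.sqrt(np.mean((pred[washout:] - x[washout + 1:]) ** 2))
print(f"one-step training RMSE: {err:.4f}")
```

The random matrices `W` and `W_in` are exactly the source of the hyperparameter burden the abstract describes: spectral radius, input scaling, and reservoir size all must be tuned, even though training itself is a single linear solve.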

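The randomness-free alternative the abstract mentions (regression on a library of time lags and their polynomials, as in next-generation reservoir computing) can be sketched as follows. The lag count, polynomial order, and toy logistic-map series are assumptions for illustration, not the cited papers' exact setup; note how the feature count already grows quadratically with the number of lagged variables, which is the combinatorial explosion the abstract points to.

```python
import numpy as np

# Same toy series: the logistic map (assumed for illustration).
T = 300
x = np.empty(T)
x[0] = 0.5
for t in range(T - 1):
    x[t + 1] = 3.9 * x[t] * (1.0 - x[t])

# No random matrices: the "reservoir" is an explicit feature library of
# lagged values and all their unique pairwise (quadratic) products.
k = 2  # number of time lags (assumed)
feats = []
for t in range(k, T):
    lags = x[t - k:t + 1][::-1]                          # x[t], x[t-1], x[t-2]
    quad = np.outer(lags, lags)[np.triu_indices(k + 1)]  # unique products
    feats.append(np.concatenate(([1.0], lags, quad)))    # bias + linear + quad
Phi = np.array(feats[:-1])   # features available at time t
y = x[k + 1:]                # target: the next value x[t+1]

# Linear least squares is the entire training step.
W = np.linalg.lstsq(Phi, y, rcond=None)[0]
pred = Phi @ W
print("max abs training error:", np.max(np.abs(pred - y)))
```

Because the logistic map is itself a quadratic polynomial of one lag, it lies exactly inside this library, so the fit is essentially exact here; for high-dimensional systems the library of lags and products grows rapidly, motivating the minimal architecture the paper proposes instead.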

Figures:
Fig 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c466/10415382/efcf00f79aff/41598_2023_39886_Fig1_HTML.jpg
Fig 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c466/10415382/1e46c2467bd2/41598_2023_39886_Fig2_HTML.jpg
Fig 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/c466/10415382/c1244f2e18df/41598_2023_39886_Fig3_HTML.jpg

Similar Articles

1. A novel approach to minimal reservoir computing. Sci Rep. 2023 Aug 10;13(1):12970. doi: 10.1038/s41598-023-39886-w.
2. Next generation reservoir computing. Nat Commun. 2021 Sep 21;12(1):5564. doi: 10.1038/s41467-021-25801-2.
3. Recent advances in physical reservoir computing: A review. Neural Netw. 2019 Jul;115:100-123. doi: 10.1016/j.neunet.2019.03.005. Epub 2019 Mar 20.
4. Efficient forecasting of chaotic systems with block-diagonal and binary reservoir computing. Chaos. 2023 Jun 1;33(6). doi: 10.1063/5.0151290.
5. Optimizing Reservoir Computers for Signal Classification. Front Physiol. 2021 Jun 18;12:685121. doi: 10.3389/fphys.2021.685121. eCollection 2021.
6. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality. Neural Netw. 2014 Jul;55:59-71. doi: 10.1016/j.neunet.2014.03.004. Epub 2014 Mar 21.
7. Computational analysis of memory capacity in echo state networks. Neural Netw. 2016 Nov;83:109-120. doi: 10.1016/j.neunet.2016.07.012. Epub 2016 Aug 16.
8. Optimizing neural networks for medical data sets: A case study on neonatal apnea prediction. Artif Intell Med. 2019 Jul;98:59-76. doi: 10.1016/j.artmed.2019.07.008. Epub 2019 Jul 25.
9. Multiresolution Reservoir Graph Neural Network. IEEE Trans Neural Netw Learn Syst. 2022 Jun;33(6):2642-2653. doi: 10.1109/TNNLS.2021.3090503. Epub 2022 Jun 1.
10. Reservoir computing with higher-order interactive coupled pendulums. Phys Rev E. 2023 Dec;108(6-1):064304. doi: 10.1103/PhysRevE.108.064304.

Cited By

1. Modeling nonlinear oscillator networks using physics-informed hybrid reservoir computing. Sci Rep. 2025 Jul 2;15(1):22497. doi: 10.1038/s41598-025-03957-x.
2. How more data can hurt: Instability and regularization in next-generation reservoir computing. Chaos. 2025 Jul 1;35(7). doi: 10.1063/5.0262977.
3. Predicting three-dimensional chaotic systems with four qubit quantum systems.

References

1. Robust forecasting using predictive generalized synchronization in reservoir computing. Chaos. 2021 Dec;31(12):123118. doi: 10.1063/5.0066013.
2. Next generation reservoir computing. Nat Commun. 2021 Sep 21;12(1):5564. doi: 10.1038/s41467-021-25801-2.
3. Multifunctionality in a reservoir computer. Sci Rep. 2025 Feb 20;15(1):6201. doi: 10.1038/s41598-025-87768-0.
4. Exploring Types of Photonic Neural Networks for Imaging and Computing-A Review. Nanomaterials (Basel). 2024 Apr 17;14(8):697. doi: 10.3390/nano14080697.
5. Chaos. 2021 Jan;31(1):013125. doi: 10.1063/5.0019974.
6. On explaining the surprising success of reservoir computing forecaster of chaos? The universal machine learning dynamical system with contrast to VAR and DMD. Chaos. 2021 Jan;31(1):013108. doi: 10.1063/5.0024890.
7. Breaking symmetries of the reservoir equations in echo state networks. Chaos. 2020 Dec;30(12):123142. doi: 10.1063/5.0028993.
8. Good and bad predictions: Assessing and improving the replication of chaotic attractors by means of reservoir computing. Chaos. 2019 Oct;29(10):103143. doi: 10.1063/1.5118725.
9. Testing Statistical Laws in Complex Systems. Phys Rev Lett. 2019 Apr 26;122(16):168301. doi: 10.1103/PhysRevLett.122.168301.
10. Recent advances in physical reservoir computing: A review. Neural Netw. 2019 Jul;115:100-123. doi: 10.1016/j.neunet.2019.03.005. Epub 2019 Mar 20.
11. Reservoir Computing Universality With Stochastic Inputs. IEEE Trans Neural Netw Learn Syst. 2020 Jan;31(1):100-112. doi: 10.1109/TNNLS.2019.2899649. Epub 2019 Mar 18.
12. Scale-free networks are rare. Nat Commun. 2019 Mar 4;10(1):1017. doi: 10.1038/s41467-019-08746-5.