
Stabilizing machine learning prediction of dynamics: Novel noise-inspired regularization tested with reservoir computing.

Affiliations

Department of Physics, University of Maryland, 4150 Campus Dr, 20742, College Park, United States.

Hillsdale College, 33 E College St, 49242, Hillsdale, United States.

Publication information

Neural Netw. 2024 Feb;170:94-110. doi: 10.1016/j.neunet.2023.10.054. Epub 2023 Nov 7.

Abstract

Recent work has shown that machine learning (ML) models can skillfully forecast the dynamics of unknown chaotic systems. Short-term predictions of the state evolution and long-term predictions of the statistical patterns of the dynamics ("climate") can be produced by employing a feedback loop, whereby the model is trained to predict forward only one time step, then the model output is used as input for multiple time steps. In the absence of mitigating techniques, however, this feedback can result in artificially rapid error growth ("instability"). One established mitigating technique is to add noise to the ML model training input. Based on this technique, we formulate a new penalty term in the loss function for ML models with memory of past inputs that deterministically approximates the effect of many small, independent noise realizations added to the model input during training. We refer to this penalty and the resulting regularization as Linearized Multi-Noise Training (LMNT). We systematically examine the effect of LMNT, input noise, and other established regularization techniques in a case study using reservoir computing, a machine learning method using recurrent neural networks, to predict the spatiotemporal chaotic Kuramoto-Sivashinsky equation. We find that reservoir computers trained with noise or with LMNT produce climate predictions that appear to be indefinitely stable and have a climate very similar to the true system, while the short-term forecasts are substantially more accurate than those trained with other regularization techniques. Finally, we show that the deterministic aspect of our LMNT regularization facilitates fast reservoir computer regularization hyperparameter tuning.
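The one-step training and closed-loop ("feedback") prediction scheme described above, together with the input-noise regularization that LMNT approximates, can be sketched in a minimal reservoir computer. This is an illustrative toy only: it predicts a scalar logistic-map signal rather than the Kuramoto-Sivashinsky equation, and all parameter values (reservoir size, spectral radius, noise amplitude, ridge term) are assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy chaotic signal: logistic map (stand-in for the true dynamical system).
T = 2000
u = np.empty(T)
u[0] = 0.4
for t in range(T - 1):
    u[t + 1] = 3.9 * u[t] * (1.0 - u[t])

# Random, fixed reservoir (recurrent network); only Wout is trained.
N = 200
Win = rng.uniform(-0.5, 0.5, N)
W = rng.normal(0.0, 1.0, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # scale spectral radius to 0.9

noise_std = 1e-3          # input-noise regularization strength (assumed value)
r = np.zeros(N)
states, targets = [], []
for t in range(T - 1):
    u_noisy = u[t] + noise_std * rng.normal()     # noise added to training input
    r = np.tanh(W @ r + Win * u_noisy)
    if t > 100:                                   # discard initial transient
        states.append(r.copy())
        targets.append(u[t + 1])                  # train to predict one step ahead

R = np.array(states)
Y = np.array(targets)
beta = 1e-8               # small Tikhonov ridge; the input noise does most regularizing
Wout = np.linalg.solve(R.T @ R + beta * np.eye(N), R.T @ Y)

# Closed-loop prediction: the model's own output is fed back as its next input.
pred = []
x = u[T - 1]
for _ in range(200):
    r = np.tanh(W @ r + Win * x)
    x = r @ Wout
    pred.append(x)
```

Without the input noise (or an equivalent penalty such as LMNT), small output errors fed back through the loop can amplify rapidly, which is the instability the abstract refers to.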
