Shirin Panahi, Ying-Cheng Lai
School of Electrical, Computer, and Energy Engineering, Arizona State University, Tempe, Arizona 85287, USA.
Department of Physics, Arizona State University, Tempe, Arizona 85287, USA.
Chaos. 2024 May 1;34(5). doi: 10.1063/5.0200898.
A problem in nonlinear and complex dynamical systems with broad applications is forecasting the occurrence of a critical transition based solely on data, without knowledge of the system equations. When such a transition leads to system collapse, as is often the case, all the available data are from the pre-critical regime where the system still functions normally, making the prediction problem challenging. In recent years, a machine-learning approach tailored to this difficult prediction problem, adaptable reservoir computing, has been articulated. This Perspective introduces the basics of this machine-learning scheme and describes representative results. The general setting is that the system dynamics currently live on a normal attractor with oscillatory dynamics and, as a bifurcation parameter changes into the future, a critical transition can occur after which the system switches to a completely different attractor, signifying system collapse. To predict a critical transition, it is essential that the reservoir computer not only learn the dynamical "climate" of the system of interest at some specific parameter value but, more importantly, discover how the system dynamics change with the bifurcation parameter. It is demonstrated that the machine can be endowed with this capability through a training process using time series from a small number of distinct, pre-critical parameter values, thereby enabling accurate and reliable prediction of the catastrophic critical transition. Three applications are presented: predicting crises, forecasting amplitude death, and creating digital twins of nonlinear dynamical systems. Limitations and future perspectives are discussed.
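The core idea described above, training a reservoir computer on time series from a few pre-critical parameter values while feeding the parameter itself through an extra input channel, can be illustrated with a minimal sketch. This is not the authors' exact architecture; it is a hypothetical echo-state-network toy on the logistic map, where the reservoir is trained at several pre-transition parameter values and then run in closed loop at an unseen value:

```python
import numpy as np

rng = np.random.default_rng(42)

def logistic_series(r, n, x0=0.5, transient=100):
    # Time series of the logistic map x_{t+1} = r x_t (1 - x_t),
    # a stand-in for an oscillatory system with a bifurcation parameter r.
    x, out = x0, []
    for t in range(n + transient):
        x = r * x * (1 - x)
        if t >= transient:
            out.append(x)
    return np.array(out)

# Echo state network with two input channels: the signal and the
# bifurcation parameter (the "parameter-cognizant" ingredient).
N = 200
W_in = rng.uniform(-0.5, 0.5, (N, 2))
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9

def drive(u, p, r=None):
    # Drive the reservoir with signal u at constant parameter p.
    r = np.zeros(N) if r is None else r
    states = []
    for ut in u:
        r = np.tanh(W @ r + W_in @ np.array([ut, p]))
        states.append(r.copy())
    return np.array(states), r

# Training: one-step prediction at a few distinct pre-critical parameters.
train_params = [3.1, 3.3, 3.5]
X_list, Y_list = [], []
for p in train_params:
    s = logistic_series(p, 500)
    states, _ = drive(s[:-1], p)
    X_list.append(states)
    Y_list.append(s[1:])
X, Y = np.vstack(X_list), np.concatenate(Y_list)

# Ridge regression for the output layer.
beta = 1e-6
W_out = np.linalg.solve(X.T @ X + beta * np.eye(N), X.T @ Y)

def predict(p, n_steps, warmup):
    # Warm up on a short real segment, then run closed loop:
    # the reservoir's own output is fed back as the next input,
    # with the parameter channel held at the target value p.
    _, r = drive(warmup, p)
    u, preds = W_out @ r, []
    for _ in range(n_steps):
        preds.append(u)
        r = np.tanh(W @ r + W_in @ np.array([u, p]))
        u = W_out @ r
    return np.array(preds)

# Closed-loop run at a parameter value not seen during training.
preds = predict(3.2, 200, logistic_series(3.2, 50))
```

The parameter channel is what lets the trained machine be interrogated at parameter values beyond the training set; in the scheme the Perspective describes, this extrapolation into the post-critical regime is what reveals the transition.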