Hart, Joseph D.
U.S. Naval Research Laboratory, Code 5675, Washington, DC 20375, USA.
Chaos. 2024 Apr 1;34(4). doi: 10.1063/5.0196257.
Reservoir computing is a machine learning framework that has been shown to replicate the chaotic attractor, including the fractal dimension and the entire Lyapunov spectrum, of the dynamical system on which it is trained. We quantitatively relate the generalized synchronization dynamics of a driven reservoir during the training stage to the performance of the trained reservoir computer at the attractor reconstruction task. We show that, in order to obtain successful attractor reconstruction and Lyapunov spectrum estimation, the maximal conditional Lyapunov exponent of the driven reservoir must be significantly more negative than the most negative Lyapunov exponent of the target system. We also find that the maximal conditional Lyapunov exponent of the reservoir depends strongly on the spectral radius of the reservoir adjacency matrix; therefore, for attractor reconstruction and Lyapunov spectrum estimation, reservoir computers with small spectral radius generally perform better. Our arguments are supported by numerical examples on well-known chaotic systems.
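The two quantities the abstract relates can be computed numerically. The sketch below (not from the paper; a minimal illustration of the standard echo-state-network setup) rescales a random reservoir adjacency matrix to a chosen spectral radius, then estimates the maximal conditional Lyapunov exponent by driving two infinitesimally separated reservoir states with the same input and measuring their average contraction rate. The reservoir size, input scaling, and sinusoidal stand-in drive signal are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200    # reservoir size (hypothetical choice)
rho = 0.4  # target spectral radius; the abstract finds small values work better

# Random reservoir adjacency matrix, rescaled to the target spectral radius.
A = rng.normal(size=(N, N)) / np.sqrt(N)
A *= rho / np.max(np.abs(np.linalg.eigvals(A)))

W_in = rng.uniform(-1.0, 1.0, size=N)  # input weights (hypothetical scaling)

def step(r, u):
    """One tanh reservoir update driven by scalar input u."""
    return np.tanh(A @ r + W_in * u)

# Drive two nearby reservoir states with the SAME input signal; the average
# exponential growth rate of their separation estimates the maximal
# conditional Lyapunov exponent of the driven reservoir (Benettin-style,
# with renormalization to keep the perturbation infinitesimal).
u = np.sin(0.1 * np.arange(3000))  # stand-in for the training drive signal
r1 = 1e-3 * rng.normal(size=N)
d0 = 1e-8
r2 = r1 + d0 * rng.normal(size=N) / np.sqrt(N)
d0 = np.linalg.norm(r2 - r1)

log_growth = 0.0
for t in range(len(u)):
    r1, r2 = step(r1, u[t]), step(r2, u[t])
    d = np.linalg.norm(r2 - r1)
    log_growth += np.log(d / d0)
    r2 = r1 + (r2 - r1) * (d0 / d)  # renormalize separation back to d0

lambda_max = log_growth / len(u)  # per-step estimate; negative => contraction
print(lambda_max)
```

A negative `lambda_max` indicates generalized synchronization of the driven reservoir with its input; the paper's claim is that it must be not merely negative but significantly more negative than the most negative Lyapunov exponent of the target system for attractor reconstruction to succeed.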