Ehlers Peter J, Nurdin Hendra I, Soh Daniel
Wyant College of Optical Sciences, University of Arizona, Tucson, AZ, USA.
School of Electrical Engineering and Telecommunications, University of New South Wales, Sydney, Australia.
Neural Netw. 2025 Apr;184:107101. doi: 10.1016/j.neunet.2024.107101. Epub 2024 Dec 31.
Reservoir computing, using nonlinear dynamical systems, offers a cost-effective alternative to neural networks for complex tasks involving processing of sequential data, time series modeling, and system identification. Echo state networks (ESNs), a type of reservoir computer, mirror neural networks but simplify training. They apply fixed, random linear transformations to the internal state, followed by a nonlinear activation. This process, guided by input signals and a linear regression readout, adapts the system to match target characteristics, reducing computational demands. A potential drawback of ESNs is that the fixed reservoir may not offer the complexity needed for specific problems. While directly altering (training) the internal reservoir would reintroduce the computational burden, an indirect modification can be achieved by redirecting some output as input. This feedback can influence the internal reservoir state, yielding ESNs with enhanced complexity suitable for broader challenges. In this paper, we demonstrate that by feeding some component of the reservoir state back into the network through the input, we can drastically improve upon the performance of a given ESN. We rigorously prove that, for any given ESN, feedback will almost always improve the accuracy of the output. For a set of three tasks, each representing a different problem class, we find that with feedback the average error measures are reduced by 30%-60%. Remarkably, feedback provides at least an equivalent performance boost to doubling the initial number of computational nodes, a computationally expensive and technologically challenging alternative. These results demonstrate the broad applicability and substantial usefulness of this feedback scheme.
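The mechanism the abstract describes can be sketched in a few lines of NumPy: a fixed random reservoir with a tanh nonlinearity, a ridge-regression readout as the only trained component, and a feedback path that routes a fixed linear projection of the reservoir state back in through the input channel. This is a minimal illustrative sketch, not the authors' implementation; the hyperparameters, the toy NARMA-like task, and the choice of a fixed random feedback projection are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

class ESN:
    """Minimal echo state network; all hyperparameters are illustrative assumptions."""

    def __init__(self, n_in, n_res, spectral_radius=0.9, feedback=False):
        self.feedback = feedback
        n_total_in = n_in + (1 if feedback else 0)  # one extra channel for feedback
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_total_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale recurrent weights so the echo state property plausibly holds.
        self.W = W * (spectral_radius / max(abs(np.linalg.eigvals(W))))
        # Fixed random projection of the reservoir state that re-enters as input.
        self.w_fb = rng.uniform(-0.5, 0.5, n_res) if feedback else None

    def run(self, U):
        x = np.zeros(self.W.shape[0])
        states = []
        for u in U:
            if self.feedback:
                u = np.concatenate([u, [self.w_fb @ x]])  # feedback channel
            x = np.tanh(self.W @ x + self.W_in @ u)       # fixed linear map + nonlinearity
            states.append(x)
        return np.array(states)

    def train(self, U, Y, ridge=1e-6):
        X = self.run(U)
        # Ridge regression readout -- the only trained part of the ESN.
        self.W_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ Y)
        return X @ self.W_out

# Toy nonlinear sequence task (illustrative; not one of the paper's benchmarks).
T = 500
u = rng.uniform(0, 0.5, (T, 1))
y = np.zeros(T)
for t in range(2, T):
    y[t] = 0.4 * y[t - 1] + 0.4 * y[t - 1] * y[t - 2] + 0.6 * u[t - 1, 0] ** 3 + 0.1

esn = ESN(n_in=1, n_res=50, feedback=True)
pred = esn.train(u, y)
print("training RMSE:", np.sqrt(np.mean((pred - y) ** 2)))
```

Because the feedback projection is fixed and only the readout is fit, training cost stays at a single linear regression, while the feedback loop enriches the reservoir dynamics, which is the effect the paper quantifies.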