Wang Hai, Long Xingyi, Liu Xue-Xin
IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):10487-10501. doi: 10.1109/TNNLS.2022.3167466. Epub 2023 Nov 30.
Echo state networks (ESNs) are reservoir computing-based recurrent neural networks widely used in pattern analysis and machine intelligence applications. To achieve high accuracy with large model capacity, ESNs usually contain a large internal layer (reservoir), which makes the evaluation process too slow for some applications. In this work, we speed up ESN evaluation by building a reduced network called the fast ESN (fastESN), achieving, for the first time, an evaluation complexity independent of the original ESN size. FastESN is generated using three techniques. First, the high-dimensional state of the original ESN is approximated by a low-dimensional state through proper orthogonal decomposition (POD)-based projection. Second, the number of activation function evaluations is reduced through the discrete empirical interpolation method (DEIM). Third, we show that the directly generated fastESN has instability problems and provide a stabilization scheme as a solution. Through experiments on four popular benchmarks, we show that fastESN accelerates sparse storage-based ESN evaluation, achieving a high parameter compression ratio and fast evaluation speed.
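The POD/DEIM pipeline the abstract describes can be illustrated with a minimal NumPy sketch. This is not the paper's implementation (and omits the stabilization scheme); the reservoir sizes, input signal, and variable names are all illustrative assumptions. It shows the core idea: project the reservoir state onto a low-dimensional POD basis, use DEIM to evaluate `tanh` at only `r` interpolation points, and precompute small matrices so the per-step cost depends on the reduced order `r` rather than the full reservoir size `N`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical, not from the paper): full reservoir N,
# reduced order r, snapshot count T.
N, r, T = 200, 10, 300
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius < 1
W_in = rng.standard_normal(N)

# 1) Collect state snapshots by running the full ESN on a training signal.
u = np.sin(0.1 * np.arange(T))
x = np.zeros(N)
S = np.empty((N, T))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])   # state update = nonlinearity output
    S[:, t] = x

# 2) POD: leading left singular vectors of the snapshot matrix give the
#    low-dimensional projection basis V (orthonormal columns).
U, _, _ = np.linalg.svd(S, full_matrices=False)
V = U[:, :r]                           # N x r

# 3) DEIM: greedy selection of r interpolation indices on the nonlinearity
#    basis. Here the state snapshots are exactly the tanh outputs, so the
#    same singular vectors serve as the nonlinearity basis.
Uf = U[:, :r]
p = [int(np.argmax(np.abs(Uf[:, 0])))]
for j in range(1, r):
    c = np.linalg.solve(Uf[np.ix_(p, list(range(j)))], Uf[p, j])
    residual = Uf[:, j] - Uf[:, :j] @ c
    p.append(int(np.argmax(np.abs(residual))))

# Precompute small operators; per-step cost now scales with r, not N.
A  = V.T @ Uf @ np.linalg.inv(Uf[p, :])   # r x r DEIM lifting operator
Wr = (W @ V)[p, :]                        # r x r: rows of W V at DEIM points
br = W_in[p]                              # r:     rows of W_in at DEIM points

# 4) Reduced evaluation: only r tanh evaluations per step.
z = np.zeros(r)
for t in range(T):
    z = A @ np.tanh(Wr @ z + br * u[t])

x_approx = V @ z                          # lift back to the full state space
```

Note that this naive reduced system can drift or diverge on some reservoirs, which is exactly the instability the paper addresses with its stabilization scheme; the sketch above only covers the POD and DEIM steps.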