School of Mathematics, South China University of Technology, Guangzhou, 510640, China.
International Research Center for Neurointelligence, The University of Tokyo, Tokyo, 113-0033, Japan.
Nat Commun. 2020 Sep 11;11(1):4568. doi: 10.1038/s41467-020-18381-0.
We develop an auto-reservoir computing framework, the Auto-Reservoir Neural Network (ARNN), to make efficient and accurate multi-step-ahead predictions from a short-term, high-dimensional time series. Unlike traditional reservoir computing, whose reservoir is an external dynamical system unrelated to the target system, ARNN directly uses the observed high-dimensional dynamics as its reservoir, mapping the high-dimensional/spatial data to future temporal values of a target variable via our spatiotemporal information (STI) transformation. The multi-step prediction of the target variable is thus achieved in an accurate and computationally efficient manner. ARNN is successfully applied to both representative models and real-world datasets, showing satisfactory multi-step-ahead prediction performance even when the data are perturbed by noise and the system is time-varying. In effect, the ARNN transformation equivalently expands the sample size, and thus has great potential for practical applications in artificial intelligence and machine learning.
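To make the STI idea concrete, the sketch below shows how a spatial snapshot x_t can be mapped to a short temporal window (y_t, ..., y_{t+L-1}) of a target variable, so that the spatial dimension is traded for extra temporal samples. This is a minimal, assumption-laden illustration using a purely linear ridge-regression map rather than the paper's ARNN (which employs a fixed neural network as the reservoir and solves coupled primary/conjugate STI equations); the function name `linear_sti_forecast` and the toy data are invented for illustration only.

```python
# Minimal sketch of the STI idea with a linear map; NOT the paper's ARNN.
import numpy as np

def linear_sti_forecast(X, y, L, ridge=1e-3):
    """Predict the L-1 future values of y from a short high-dimensional series.

    X : (D, m) array, observed high-dimensional states x_1..x_m
    y : (m,) array, observed target values y_1..y_m (a component of X)
    L : embedding length; the STI map sends x_t to (y_t, ..., y_{t+L-1})
    """
    D, m = X.shape
    preds = [[] for _ in range(L - 1)]      # collects estimates of y_{m+1}..y_{m+L-1}
    for j in range(L):                      # row j of the map sends x_t -> y_{t+j}
        n_known = m - j                     # pairs (x_t, y_{t+j}) fully observed
        A = X[:, :n_known].T                # (n_known, D)
        b = y[j:j + n_known]                # (n_known,)
        # Ridge-regularised least squares for row j of the linear STI map
        w = np.linalg.solve(A.T @ A + ridge * np.eye(D), A.T @ b)
        for t in range(n_known, m):         # t + j points beyond the observed window
            preds[t + j - m].append(X[:, t] @ w)
    # Each future step is predicted by several rows; average them
    return np.array([np.mean(p) for p in preds])

# Toy usage: coupled noisy sinusoids with random phases (illustrative data only)
rng = np.random.default_rng(0)
D, m, L = 20, 50, 11
t_grid = np.arange(m + L - 1) * 0.2
phases = rng.uniform(0, 2 * np.pi, D)
full = np.sin(t_grid[None, :] + phases[:, None]) + 0.01 * rng.standard_normal((D, m + L - 1))
X, y = full[:, :m], full[0, :m]
print("predicted:", np.round(linear_sti_forecast(X, y, L), 3))
print("true     :", np.round(full[0, m:], 3))
```

Even in this simplified linear form, each future time point receives several independent estimates (one from each row of the map that reaches it), which reflects how the STI transformation effectively enlarges the usable sample size of a short series.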