Reducing echo state network size with controllability matrices.

Affiliation

Scripps Institution of Oceanography, University of California at San Diego, La Jolla, California 92093-0238, USA.

Publication Information

Chaos. 2022 Jul;32(7):073116. doi: 10.1063/5.0071926.

Abstract

Echo state networks are a fast-training variant of recurrent neural networks that excel at approximating nonlinear dynamical systems and at time series prediction. These machine learning models act as nonlinear fading-memory filters. While they benefit from quick training and low complexity, the computation demanded by a large reservoir matrix is a bottleneck. Using control theory, a reduced-size replacement reservoir matrix is found. Starting from a large, task-effective reservoir matrix, we form a controllability matrix whose rank indicates the active sub-manifold and the candidate replacement reservoir size. The resulting speed-ups and reduced memory usage come with a minimal increase in error for chaotic climate reconstruction or short-term prediction. Experiments are performed on simple time series signals and on the Lorenz-1963 and Mackey-Glass complex chaotic signals. Observing low-error models shows variation of the active rank and memory along a sequence of predictions.
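
As a rough illustration of the rank test described in the abstract, the sketch below builds the standard Kalman controllability matrix C = [B, AB, A^2 B, ..., A^(n-1) B], with a reservoir matrix W playing the role of the state matrix A and the input weights W_in playing the role of B, and reports its numerical rank as a candidate replacement reservoir size. This is a minimal NumPy sketch of the textbook construction, not the paper's exact procedure; the reservoir size, sparsity, spectral-radius scaling, and tolerance used here are illustrative assumptions.

import numpy as np

def controllability_rank(W, W_in, tol=1e-9):
    """Numerical rank of the controllability matrix C = [B, AB, ..., A^(n-1) B],
    taking the reservoir matrix W as the state matrix A and the input weights
    W_in as B.  The rank bounds the dimension of the reachable subspace and is
    used here as a candidate size for a smaller replacement reservoir."""
    n = W.shape[0]
    blocks = [W_in]
    for _ in range(n - 1):
        blocks.append(W @ blocks[-1])          # A^k B for k = 1 .. n-1
    C = np.hstack(blocks)
    # Columns A^k B shrink geometrically when the spectral radius is below 1,
    # so an explicit relative tolerance keeps the rank estimate from counting
    # floating-point noise.
    s = np.linalg.svd(C, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

# Illustrative reservoir: sparse random weights scaled to spectral radius 0.9,
# one input channel (these values are assumptions, not taken from the paper).
rng = np.random.default_rng(0)
n, k = 200, 1
W = rng.normal(size=(n, n)) * (rng.random((n, n)) < 0.05)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(size=(n, k))
print("candidate replacement reservoir size:", controllability_rank(W, W_in))

The rank found this way corresponds to what the abstract calls the active sub-manifold; a smaller reservoir of roughly that dimension would then be trained and compared against the original.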

