Local Dynamics in Trained Recurrent Neural Networks.

Author Information

Rivkind Alexander, Barak Omri

Affiliations

Faculty of Medicine, Technion-Israel Institute of Technology, Haifa 32000, Israel.

Network Biology Research Laboratories, Technion-Israel Institute of Technology, Haifa 32000, Israel.

Publication Information

Phys Rev Lett. 2017 Jun 23;118(25):258101. doi: 10.1103/PhysRevLett.118.258101.

Abstract

Learning a task induces connectivity changes in neural circuits, thereby changing their dynamics. To elucidate task-related neural dynamics, we study trained recurrent neural networks. We develop a mean field theory for reservoir computing networks trained to have multiple fixed point attractors. Our main result is that the dynamics of the network's output in the vicinity of attractors is governed by a low-order linear ordinary differential equation. The stability of the resulting equation can be assessed, predicting training success or failure. As a consequence, networks of rectified linear units and of sigmoidal nonlinearities are shown to have diametrically different properties when it comes to learning attractors. Furthermore, a characteristic time constant, which remains finite at the edge of chaos, offers an explanation of the network's output robustness in the presence of variability of the internal neural dynamics. Finally, the proposed theory predicts state-dependent frequency selectivity in the network response.
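To make the setup described in the abstract concrete, here is a minimal numerical sketch of the pipeline it summarizes: train the linear readout of a random rate network with output feedback so that the closed loop has a fixed point at a target output value, then linearize around that fixed point and inspect the eigenvalues of the Jacobian. This is an illustration under stated assumptions, not the paper's exact protocol; the network size N, gain g, target output A, and the least-norm choice of readout w_out are all placeholders.

```python
import numpy as np

# Minimal sketch (assumed setup, not the paper's exact protocol):
# a random rate network x' = -x + J*tanh(x) + w_fb*z with linear readout
# z = w_out . tanh(x), trained so the output has a fixed point at A,
# followed by a linear stability check of that fixed point.
# N, g, A, dt and the least-norm readout are illustrative choices.

rng = np.random.default_rng(0)
N, g, A, dt = 300, 0.9, 0.5, 0.1
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # random recurrent weights
w_fb = rng.uniform(-1.0, 1.0, N)                  # feedback from the output

# Find the fixed point of the clamped (teacher-forced) dynamics,
# x* = J tanh(x*) + w_fb * A, by integrating with the output held at A.
x = np.zeros(N)
for _ in range(5000):
    x += dt * (-x + J @ np.tanh(x) + w_fb * A)
r_star = np.tanh(x)

# Least-norm readout satisfying w_out . r* = A, so that x* remains a
# fixed point of the closed loop with output exactly A.
w_out = r_star * (A / (r_star @ r_star))

# Stability check: Jacobian of the closed-loop dynamics at x*.
# All eigenvalues in the left half-plane predict training success
# (the fixed point is realized as an attractor); any eigenvalue in
# the right half-plane predicts failure.
D = np.diag(1.0 - r_star**2)                      # derivative of tanh at x*
Jac = -np.eye(N) + (J + np.outer(w_fb, w_out)) @ D
print("max Re(eigenvalue):", np.linalg.eigvals(Jac).real.max())
```

In the abstract's terms, the sign of the leading eigenvalue's real part is what the mean field theory predicts analytically, and the same linearization is where the low-order linear ODE for the output near the attractor and the contrast between rectified linear and sigmoidal nonlinearities enter.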
