Tanaka Takuma, Nakajima Kohei, Aoyagi Toshio
Graduate School of Data Science, Shiga University, 1-1-1 Banba, Hikone, Shiga 522-8522, Japan.
Graduate School of Information Science and Technology, University of Tokyo, Tokyo 113-8656, Japan.
Neurosci Res. 2020 Jul;156:225-233. doi: 10.1016/j.neures.2020.02.001. Epub 2020 Feb 14.
Reservoir computing is a framework for exploiting the inherent transient dynamics of recurrent neural networks (RNNs) as a computational resource. Within this framework, much research has evaluated the relationship between the dynamics of RNNs and their information processing capability. In this study, we present a detailed analysis of the information processing capability of an RNN optimized by recurrent infomax (RI), an unsupervised learning method that maximizes the mutual information of an RNN by adjusting its connection weights. The results indicate that RI leads to the emergence of a delay-line structure and that the network optimized by RI possesses superior short-term memory, i.e., the ability to store the temporal information of the input stream in its transient dynamics.
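The short-term memory discussed in the abstract is commonly quantified in reservoir computing as "memory capacity": linear readouts are trained to reconstruct past inputs at increasing delays, and the squared correlations between reconstructions and targets are summed. The sketch below illustrates this standard measure on a generic random echo state network; it is not the paper's RI-optimized RNN, and all sizes and parameters (reservoir size, spectral radius, ridge penalty) are illustrative assumptions.

```python
# Minimal sketch: memory capacity of a random reservoir (generic echo state
# network, NOT the paper's recurrent-infomax-optimized RNN).
import numpy as np

rng = np.random.default_rng(0)

N = 100          # reservoir size (assumed, illustrative)
T = 2000         # length of the input stream
washout = 200    # initial transient discarded before training readouts
max_delay = 20   # longest delay probed

# Random recurrent weights, rescaled to spectral radius 0.9 (a common
# heuristic for the echo-state property), plus random input weights.
W = rng.normal(size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-1, 1, size=N)

u = rng.uniform(-1, 1, size=T)          # i.i.d. input stream
x = np.zeros(N)
states = np.empty((T, N))
for t in range(T):
    x = np.tanh(W @ x + w_in * u[t])    # reservoir update
    states[t] = x

X = states[washout:]                    # states after the washout period
mc = 0.0
for d in range(1, max_delay + 1):
    target = u[washout - d:T - d]       # input delayed by d steps
    # Ridge-regression readout trained to reconstruct the delayed input.
    w = np.linalg.solve(X.T @ X + 1e-6 * np.eye(N), X.T @ target)
    pred = X @ w
    r = np.corrcoef(pred, target)[0, 1]
    mc += r ** 2                        # memory-capacity term at delay d

print(f"memory capacity over {max_delay} delays: {mc:.2f}")
```

In this setup, a network with a pronounced delay-line structure, such as the one the study reports emerging under RI, would retain high squared correlation out to longer delays and hence yield a larger total memory capacity than a generic random reservoir.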