Department of Information and Sciences, School of Arts and Sciences, Tokyo Woman's Christian University, 2-6-1 Zempukuji, Suginami-ku, Tokyo 167-8585, Japan.
Graduate School of Information Science and Technology, University of Tokyo, Bunkyo-ku, Tokyo 113-8656, Japan.
Phys Rev E. 2019 Dec;100(6-1):062312. doi: 10.1103/PhysRevE.100.062312.
The ability of discrete-time nonlinear recurrent neural networks to store time-varying small input signals is investigated with mean-field theory. The combination of a small input strength and mean-field assumptions makes it possible to derive an approximate expression for the conditional probability density of the state of a neuron given a past input signal. From this conditional probability density, we can analytically calculate short-term memory measures, such as memory capacity, mutual information, and Fisher information, and determine the relationships among these measures, which, to the best of our knowledge, have not been clarified to date. We show that the network contribution of these short-term memory measures peaks before the edge of chaos, where the dynamics of the input-driven network is stable but the corresponding system without an input signal is unstable.
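As a rough illustration of the kind of quantity studied here, the following minimal sketch numerically estimates the memory capacity of a discrete-time recurrent network x(t+1) = tanh(g W x(t) + v u(t)) driven by a small i.i.d. input u(t), by summing over delays the squared correlation between u(t-k) and its best linear reconstruction from the network state. This is an assumed numerical setup for illustration only, not the paper's analytical mean-field calculation; the gain g, input strength, network size, and delay range are arbitrary choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200             # number of neurons (illustrative)
g = 0.9             # coupling gain; g < 1 keeps the input-free system stable
v_in = 0.1          # small input strength
T_washout, T = 500, 5000
max_delay = 40

W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))   # random recurrent weights
v = rng.normal(0.0, 1.0, size=N)                      # input weights
u = rng.uniform(-1.0, 1.0, size=T_washout + T)        # i.i.d. input signal

# Run the network and collect states after a washout period.
x = np.zeros(N)
states = np.zeros((T, N))
for t in range(T_washout + T):
    x = np.tanh(g * (W @ x) + v_in * v * u[t])
    if t >= T_washout:
        states[t - T_washout] = x

# Memory capacity: for each delay k, MC_k is the squared correlation between
# the delayed input u(t-k) and its optimal linear readout from the state x(t);
# MC is the sum of MC_k over delays.
mc = []
for k in range(1, max_delay + 1):
    target = u[T_washout - k : T_washout - k + T]     # u(t-k) aligned with states
    w_out, *_ = np.linalg.lstsq(states, target, rcond=None)
    pred = states @ w_out
    mc.append(np.corrcoef(pred, target)[0, 1] ** 2)

print(f"estimated memory capacity MC ~ {sum(mc):.2f}")
```

Sweeping g across the transition between the stable and chaotic regimes in such a simulation would show the peak in the memory measures near, but before, the edge of chaos that the paper derives analytically.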