Boedecker Joschka, Obst Oliver, Mayer N Michael, Asada Minoru
HFSP J. 2009 Oct;3(5):340-9. doi: 10.2976/1.3240502. Epub 2009 Oct 26.
Reservoir computing (RC) is a recent paradigm in the field of recurrent neural networks. Networks in RC have a sparsely and randomly connected fixed hidden layer, and only output connections are trained. RC networks have recently received increased attention as a mathematical model for generic neural microcircuits, used to investigate and explain computations in neocortical columns. However, when applied to specific tasks, their fixed random connectivity leads to significant variation in performance. Few problem-specific optimization procedures are known; such procedures would be important for engineering applications, but also for understanding how networks in biology are shaped to be optimally adapted to the requirements of their environment. We study a general network initialization method using permutation matrices and derive a new unsupervised learning rule based on intrinsic plasticity (IP). The IP-based rule uses only local information, and its aim is to improve network performance in a self-organized way. Using three different benchmarks, we show that networks whose reservoir connectivity is given by permutation matrices have much more persistent memory than networks initialized by the other methods, while still being able to perform highly nonlinear mappings. We also show that IP based on sigmoid transfer functions is limited with respect to the output distributions that can be achieved.
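To make the setup concrete, here is a minimal sketch of an echo state network whose reservoir matrix is a scaled permutation matrix, with only the output weights trained by ridge regression. This assumes a standard echo-state-network formulation; the reservoir size, input scaling, spectral radius, delay task, and ridge parameter are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100           # reservoir size (illustrative)
input_scale = 0.5
rho = 0.95        # a permutation matrix has spectral radius 1;
                  # scaling by rho < 1 keeps the echo state property

# Reservoir matrix from a random permutation: each neuron receives
# exactly one recurrent connection, so the reservoir is a set of
# delay rings, which favors long memory.
perm = rng.permutation(N)
W = np.zeros((N, N))
W[np.arange(N), perm] = rho

W_in = rng.uniform(-input_scale, input_scale, size=N)

def run_reservoir(u, washout):
    """Drive the reservoir with a 1-D input sequence u; return the
    state trajectory after discarding an initial washout period."""
    x = np.zeros(N)
    states = []
    for t in range(len(u)):
        x = np.tanh(W @ x + W_in * u[t])
        states.append(x.copy())
    return np.array(states)[washout:]

# Toy memory task: reproduce the input delayed by 5 steps.
T, delay, washout = 2000, 5, 100
u = rng.uniform(-1, 1, T)
X = run_reservoir(u, washout)                 # (T - washout, N)
y = u[washout - delay : T - delay]            # delayed target

# Train only the readout (ridge regression); the reservoir stays fixed.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

print("train MSE:", np.mean((X @ W_out - y) ** 2))
```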
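The specific IP rule derived in the paper is not reproduced here. As a reference point for what sigmoid-based IP looks like, below is the classic Triesch-style update for a single sigmoid (fermi) neuron, which adapts gain and bias using only locally available quantities so that the output distribution approaches an exponential with mean mu; the function name and parameter values are illustrative assumptions.

```python
import numpy as np

def ip_update(x, a, b, eta=1e-3, mu=0.2):
    """One intrinsic-plasticity step for a sigmoid neuron y = f(a*x + b).

    Uses only the local pre-activation x and output y to nudge the gain a
    and bias b toward an exponential output distribution with mean mu
    (Triesch-style rule; the paper discusses the limits of such
    sigmoid-based IP regarding achievable output distributions).
    """
    y = 1.0 / (1.0 + np.exp(-(a * x + b)))
    db = eta * (1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu)
    da = eta / a + db * x
    return a + da, b + db
```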