Department of Electronics and Information Systems, Ghent University, 9000 Ghent, Belgium.
Neural Comput. 2012 Jan;24(1):104-33. doi: 10.1162/NECO_a_00200. Epub 2011 Aug 18.
Echo state networks (ESNs) are large, random recurrent neural networks with a single trained linear readout layer. Despite the untrained nature of the recurrent weights, they are capable of performing universal computations on temporal input data, which makes them interesting for both theoretical research and practical applications. The key to their success lies in the fact that the network computes a broad set of nonlinear, spatiotemporal mappings of the input data, on which linear regression or classification can easily be performed. One could consider the reservoir as a spatiotemporal kernel, in which the mapping to a high-dimensional space is computed explicitly. In this letter, we build on this idea and extend the concept of ESNs to infinite-sized recurrent neural networks, which can be considered recursive kernels that subsequently can be used to create recursive support vector machines. We present the theoretical framework, provide several practical examples of recursive kernels, and apply them to typical temporal tasks.
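Below is a minimal sketch of the ESN setup the abstract describes: a large, random recurrent reservoir whose states are mapped to targets by a single trained linear readout (here, ridge regression). This is an illustrative assumption of a standard ESN formulation, not code from the paper; the hyperparameter names and values (spectral_radius, input_scaling, ridge) are placeholders.

```python
# Minimal echo state network sketch (NumPy): random untrained reservoir,
# trained linear readout. Illustrative only; parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def init_reservoir(n_in, n_res, spectral_radius=0.9, input_scaling=0.5):
    """Draw random, untrained input and recurrent weight matrices."""
    W_in = input_scaling * rng.uniform(-1, 1, size=(n_res, n_in))
    W = rng.normal(size=(n_res, n_res))
    # Rescale so the largest absolute eigenvalue equals spectral_radius,
    # a common heuristic for obtaining the echo state property.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(W_in, W, inputs):
    """Collect reservoir states x_t = tanh(W_in u_t + W x_{t-1})."""
    x = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

def train_readout(states, targets, ridge=1e-6):
    """Ridge regression from reservoir states to targets (the only trained part)."""
    S, Y = states, targets
    return np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ Y)

# Toy usage: one-step-ahead prediction of a sine wave.
u = np.sin(np.linspace(0, 20 * np.pi, 2000))[:, None]
W_in, W = init_reservoir(n_in=1, n_res=200)
X = run_reservoir(W_in, W, u[:-1])
W_out = train_readout(X, u[1:])
print("train MSE:", np.mean((X @ W_out - u[1:]) ** 2))
```

In this picture the reservoir states play the role of an explicit feature map; letting the reservoir size grow to infinity is what motivates the recursive-kernel view developed in the letter.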