Prater Ashley
Air Force Research Laboratory, Information Directorate, Rome NY, USA.
Neural Netw. 2017 Jul;91:66-75. doi: 10.1016/j.neunet.2017.04.008. Epub 2017 Apr 24.
Reservoir computing is a recently introduced machine learning paradigm that has been shown to be well-suited to processing spatiotemporal data. Rather than training the network's node connections and weights via backpropagation, as in traditional recurrent neural networks, reservoirs have fixed connections and weights among the 'hidden layer' nodes; traditionally, only the weights to the output layer of neurons are trained, using linear regression. We claim that for signal classification tasks one may forgo the weight-training step entirely and instead use a simple supervised clustering method based upon principal components of the reservoir states. The proposed method is analyzed mathematically and explored through numerical experiments on real-world data. The examples demonstrate that the proposed method may outperform the traditional trained-output-weight approach in terms of classification accuracy.
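The abstract's idea can be illustrated with a minimal sketch: drive a fixed random reservoir (echo-state-network style) with input signals, compute per-class principal subspaces of the resulting reservoir states, and classify a new signal by which class subspace best reconstructs its state. This is an assumption-laden toy, not the paper's implementation: the reservoir size, spectral-radius scaling, synthetic two-frequency signals, and subspace dimension `k` are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- fixed random reservoir (echo-state-network style; sizes are illustrative) ---
n_res, n_in = 50, 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

def reservoir_state(u):
    """Drive the fixed reservoir with input signal u; return the final state."""
    x = np.zeros(n_res)
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
    return x

# --- synthetic two-class signals (hypothetical stand-in for real data) ---
t = np.linspace(0, 4 * np.pi, 100)
def sample(cls):
    freq = 1.0 if cls == 0 else 2.0
    return np.sin(freq * t) + 0.05 * rng.standard_normal(t.size)

train = {c: np.array([reservoir_state(sample(c)) for _ in range(30)])
         for c in (0, 1)}

# --- supervised clustering via principal components of reservoir states ---
def class_basis(X, k=5):
    """Mean and top-k principal directions of one class's reservoir states."""
    _, _, Vt = np.linalg.svd(X - X.mean(axis=0), full_matrices=False)
    return X.mean(axis=0), Vt[:k]

bases = {c: class_basis(train[c]) for c in (0, 1)}

def classify(u):
    """Assign the class whose principal subspace best reconstructs the state."""
    x = reservoir_state(u)
    resid = {}
    for c, (mu, V) in bases.items():
        xc = x - mu
        resid[c] = np.linalg.norm(xc - V.T @ (V @ xc))  # residual off the subspace
    return min(resid, key=resid.get)

acc = np.mean([classify(sample(c)) == c for c in (0, 1) for _ in range(20)])
print(f"accuracy: {acc:.2f}")
```

Note that no output weights are trained anywhere: the only "learning" is the SVD of each class's training states, matching the abstract's claim that the regression step can be skipped for classification.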