Voegtlin Thomas
INRIA-Campus Scientifique, B.P. 239 F-54506 Vandoeuvre-Les-Nancy Cedex, France.
Neural Netw. 2005 Oct;18(8):1051-63. doi: 10.1016/j.neunet.2005.07.005. Epub 2005 Sep 21.
A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal statistics of the input. Moreover, sequences stored in the network may be retrieved explicitly, in the reverse order of presentation, thus providing a straightforward neural implementation of a logical stack.
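The idea described in the abstract can be illustrated with a minimal sketch: a linear unit whose input is the current observation concatenated with its own previous output, trained with Oja's subspace rule. All specifics below (dimensions, learning rate, the damping factor on the recurrent feedback, and the use of random inputs) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_out = 3, 2   # input and context dimensions (arbitrary choices)
eta = 0.01           # learning rate (assumption)
damp = 0.5           # damping of the recurrent feedback, an assumption
                     # here to keep the recurrence stable

# W maps the concatenated vector [input; previous output] to the
# output, which serves as the learned temporal-context representation.
W = rng.standard_normal((n_out, n_in + n_out)) * 0.1

y = np.zeros(n_out)
for t in range(5000):
    u = rng.standard_normal(n_in)              # toy input at time t
    x = np.concatenate([u, damp * y])          # input plus recurrent context
    y = W @ x
    # Oja's subspace rule: a Hebbian term y x^T minus a decay term
    # y y^T W that keeps the rows of W approximately orthonormal.
    W += eta * (np.outer(y, x) - np.outer(y, y) @ W)
```

Because the previous output is fed back as part of the input, the principal subspace extracted by Oja's rule comes to reflect the temporal statistics of the sequence, which is the sense in which this generalizes PCA to time series.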