Recursive principal components analysis.

Author information

Voegtlin Thomas

Affiliation

INRIA-Campus Scientifique, B.P. 239 F-54506 Vandoeuvre-Les-Nancy Cedex, France.

Publication information

Neural Netw. 2005 Oct;18(8):1051-63. doi: 10.1016/j.neunet.2005.07.005. Epub 2005 Sep 21.

Abstract

A recurrent linear network can be trained with Oja's constrained Hebbian learning rule. As a result, the network learns to represent the temporal context associated with its input sequence. The operation performed by the network is a generalization of Principal Components Analysis (PCA) to time series, called Recursive PCA. The representations learned by the network are adapted to the temporal statistics of the input. Moreover, sequences stored in the network may be retrieved explicitly, in the reverse order of presentation, thus providing a straightforward neural implementation of a logical stack.
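The core idea can be sketched in a few lines: feed the network the concatenation of the current input and its own previous output, and update the weights with Oja's subspace rule. The sketch below is an illustration of that scheme, not the paper's exact formulation; in particular, the dimensions, the learning rate `eta`, and the scaling factor `alpha` on the fed-back state (used here to keep the recurrent dynamics bounded) are assumptions for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

d_in, d_out = 3, 4    # input dimension and number of recursive components (assumed)
eta = 0.01            # learning rate (assumed)
alpha = 0.5           # gain on the fed-back state, keeps ||y|| bounded (assumed)

# Weights act on the concatenated vector z_t = [x_t; alpha * y_{t-1}]
W = rng.normal(scale=0.1, size=(d_out, d_in + d_out))

def step(W, x, y_prev, train=True):
    """One time step of the recurrent linear network with Oja's subspace rule."""
    z = np.concatenate([x, alpha * y_prev])  # current frame + scaled previous state
    y = W @ z                                # linear recurrent response
    if train:
        # Oja's subspace rule: Hebbian term minus a decay term that drives
        # the rows of W toward an orthonormal basis of the principal subspace
        W = W + eta * (np.outer(y, z) - np.outer(y, y) @ W)
    return W, y

# Train on a random input sequence
y = np.zeros(d_out)
for t in range(5000):
    x = rng.normal(size=d_in)
    W, y = step(W, x, y)

# After convergence, W @ W.T should be close to the identity matrix
print(np.round(W @ W.T, 2))
```

Because the rows of `W` become approximately orthonormal, a stored state can be unrolled backwards: `W.T @ y` approximately reconstructs the concatenated vector `[x_t; alpha * y_{t-1}]`, so splitting it recovers the last input and (after dividing by `alpha`) the previous state, which is what gives the stack-like, reverse-order retrieval described in the abstract. The reconstruction is only approximate here, since the 7-dimensional concatenated vector is projected onto a 4-dimensional subspace.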
