Continuous Online Sequence Learning with an Unsupervised Neural Network Model.

Author Information

Cui Yuwei, Ahmad Subutai, Hawkins Jeff

Affiliation

Numenta, Inc. Redwood City, CA 94063, U.S.A.

Publication Information

Neural Comput. 2016 Nov;28(11):2474-2504. doi: 10.1162/NECO_a_00893. Epub 2016 Sep 14.

Abstract

The ability to recognize and predict temporal sequences of sensory inputs is vital for survival in natural environments. Based on many known properties of cortical neurons, hierarchical temporal memory (HTM) sequence memory has recently been proposed as a theoretical framework for sequence learning in the cortex. In this letter, we analyze properties of HTM sequence memory and apply it to sequence learning and prediction problems with streaming data. We show the model is able to continuously learn a large number of variable order temporal sequences using an unsupervised Hebbian-like learning rule. The sparse temporal codes formed by the model can robustly handle branching temporal sequences by maintaining multiple predictions until there is sufficient disambiguating evidence. We compare the HTM sequence memory with other sequence learning algorithms, including statistical methods (autoregressive integrated moving average), feedforward neural networks (time delay neural network and online sequential extreme learning machine), and recurrent neural networks (long short-term memory and echo state networks), on sequence prediction problems with both artificial and real-world data. The HTM model achieves comparable accuracy to other state-of-the-art algorithms. The model also exhibits properties that are critical for sequence learning, including continuous online learning, the ability to handle multiple predictions and branching sequences with high-order statistics, robustness to sensor noise and fault tolerance, and good performance without task-specific hyperparameter tuning. Therefore, the HTM sequence memory not only advances our understanding of how the brain may solve the sequence learning problem but is also applicable to real-world sequence learning problems from continuous data streams.
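The abstract describes the model only at a high level. As a rough illustration of the general idea of continuous online sequence learning with multiple simultaneous predictions for branching sequences, the sketch below keeps simple transition counts over variable-length contexts. This is not the authors' HTM algorithm (which uses sparse distributed representations and dendritic segments rather than symbol counts); the class, parameter, and variable names are hypothetical and chosen for illustration only.

```python
# Minimal sketch of online, single-pass sequence prediction that keeps
# multiple candidate predictions alive until context disambiguates them.
# Hypothetical toy code, not the HTM sequence memory from the paper.
from collections import defaultdict

class OnlineSequencePredictor:
    def __init__(self, max_order=3):
        self.max_order = max_order
        # counts[context][next_symbol] -> accumulated association strength
        self.counts = defaultdict(lambda: defaultdict(float))
        self.history = []

    def predict(self):
        # Longest matching context wins; an ambiguous (shared) context
        # returns several candidate next symbols at once.
        for order in range(min(self.max_order, len(self.history)), 0, -1):
            context = tuple(self.history[-order:])
            if context in self.counts:
                table = self.counts[context]
                total = sum(table.values())
                return {sym: strength / total for sym, strength in table.items()}
        return {}

    def learn(self, symbol):
        # Strengthen every (context -> symbol) association seen so far,
        # loosely in the spirit of a Hebbian-like update, then advance.
        # Learning is fully online: one symbol at a time, no batches.
        for order in range(1, min(self.max_order, len(self.history)) + 1):
            context = tuple(self.history[-order:])
            self.counts[context][symbol] += 1.0
        self.history.append(symbol)
        self.history = self.history[-self.max_order:]

# Two branching sequences, A-B-C-D and X-B-C-Y, share the subsequence B-C.
predictor = OnlineSequencePredictor(max_order=3)
for symbol in list("ABCD") * 5 + list("XBCY") * 5:
    predictor.learn(symbol)

predictor.history = list("BC")   # short, ambiguous context
print(predictor.predict())       # both 'D' and 'Y' remain predicted (~0.5 each)

predictor.history = list("ABC")  # longer context disambiguates the branch
print(predictor.predict())       # {'D': 1.0}
```

The usage example mirrors the branching-sequence point in the abstract: with only the shared context B-C, the predictor keeps both continuations, and the ambiguity is resolved only once enough high-order context is available.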
