Neuroevolution of a Modular Memory-Augmented Neural Network for Deep Memory Problems.

Affiliations

Oregon State University, Corvallis, 97330, USA

Eidgenössische Technische Hochschule Zürich, Zürich, 8092, Switzerland

Publication Information

Evol Comput. 2019 Winter;27(4):639-664. doi: 10.1162/evco_a_00239. Epub 2018 Nov 8.

Abstract

We present Modular Memory Units (MMUs), a new class of memory-augmented neural network. MMU builds on the gated neural architecture of Gated Recurrent Units (GRUs) and Long Short-Term Memory (LSTM) networks to incorporate an external memory block, similar to a Neural Turing Machine (NTM). MMU interacts with the memory block using independent read and write gates that serve to decouple the memory from the central feedforward operation. This allows for regimented memory access and update, giving our network the ability to choose when to read from memory, update it, or simply ignore it. This capacity to act in detachment allows the network to shield the memory from noise and other distractions, while simultaneously using it to effectively retain and propagate information over an extended period of time. We train MMU using both neuroevolution and gradient descent, and perform experiments on two deep memory benchmarks. Results demonstrate that MMU performs significantly faster and more accurately than traditional LSTM-based methods, and is robust to dramatic increases in the sequence depth of these memory benchmarks.
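To make the gating scheme concrete, below is a minimal NumPy sketch of an MMU-style cell written from the abstract's description alone. The class name MMUCellSketch, the weight names (W_r, W_w, W_h, W_m), the sigmoid/tanh nonlinearities, and the interpolation-style write are all illustrative assumptions, not the paper's published equations; what the sketch preserves is the central idea that independent read and write gates let a feedforward core read from, update, or entirely ignore an external memory vector.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class MMUCellSketch:
    """Illustrative MMU-style cell: a feedforward core plus an external
    memory vector accessed through independent read and write gates.
    The weight names and exact update equations are assumptions made
    for illustration, not the paper's published formulation."""

    def __init__(self, input_dim, hidden_dim, memory_dim, seed=0):
        rng = np.random.default_rng(seed)

        def w(rows, cols):
            return 0.1 * rng.standard_normal((rows, cols))

        # Gates see the current input and the current memory contents.
        self.W_r = w(memory_dim, input_dim + memory_dim)  # read gate
        self.W_w = w(memory_dim, input_dim + memory_dim)  # write gate
        # Feedforward core combines the input with the gated memory read.
        self.W_h = w(hidden_dim, input_dim + memory_dim)
        # Candidate memory content proposed from the core's activation.
        self.W_m = w(memory_dim, hidden_dim)
        self.memory = np.zeros(memory_dim)

    def step(self, x):
        xm = np.concatenate([x, self.memory])
        # Read gate: how much of the memory enters the core this step.
        # A near-zero read gate means the core ignores the memory.
        r = sigmoid(self.W_r @ xm)
        h = np.tanh(self.W_h @ np.concatenate([x, r * self.memory]))
        # Write gate: how much of the candidate update reaches memory.
        # A near-zero write gate shields the memory from noisy inputs.
        g = sigmoid(self.W_w @ xm)
        candidate = np.tanh(self.W_m @ h)
        self.memory = (1.0 - g) * self.memory + g * candidate
        return h


# Tiny smoke test: run a short random sequence through the cell.
cell = MMUCellSketch(input_dim=4, hidden_dim=8, memory_dim=6)
rng = np.random.default_rng(1)
for _ in range(10):
    out = cell.step(rng.standard_normal(4))
print(out.shape, cell.memory.shape)  # (8,) (6,)
```

Because the write is a gated interpolation, a write gate near zero leaves the memory untouched from one step to the next, which is the mechanism the abstract credits for shielding stored information from noise and distractions over long sequences.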


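The abstract states that MMU is trained with both neuroevolution and gradient descent. As a hedged illustration of the neuroevolution route, the sketch below evolves the flat weight vector of the MMUCellSketch class above using a generic elitist, mutation-only loop; the paper's actual neuroevolutionary algorithm, operators, and hyperparameters are not reproduced here, and fitness_fn is a placeholder for a score on a deep memory benchmark.

```python
import numpy as np

# Attribute names must match the weight matrices of MMUCellSketch above.
PARAM_NAMES = ("W_r", "W_w", "W_h", "W_m")


def get_genome(cell):
    """Flatten all weight matrices of a cell into one genome vector."""
    return np.concatenate([getattr(cell, n).ravel() for n in PARAM_NAMES])


def set_genome(cell, genome):
    """Write a flat genome vector back into the cell's weight matrices."""
    i = 0
    for n in PARAM_NAMES:
        m = getattr(cell, n)
        setattr(cell, n, genome[i:i + m.size].reshape(m.shape))
        i += m.size


def evolve(cell, fitness_fn, pop_size=20, n_elites=4, sigma=0.05,
           generations=100, seed=0):
    """Generic elitist neuroevolution over the cell's weights.

    fitness_fn(cell) should run the cell on a task and return a score
    to maximize. This loop is illustrative only, not the paper's
    algorithm."""
    rng = np.random.default_rng(seed)
    base = get_genome(cell)
    pop = [base + sigma * rng.standard_normal(base.size)
           for _ in range(pop_size)]
    best = (-np.inf, base)
    for _ in range(generations):
        scored = []
        for genome in pop:
            set_genome(cell, genome)
            cell.memory[:] = 0.0  # fresh memory for each evaluation
            scored.append((fitness_fn(cell), genome))
        scored.sort(key=lambda s: s[0], reverse=True)
        if scored[0][0] > best[0]:
            best = scored[0]
        elites = [g for _, g in scored[:n_elites]]
        # Elites carry over unchanged; the rest are mutated elite copies.
        pop = elites + [
            elites[rng.integers(n_elites)]
            + sigma * rng.standard_normal(base.size)
            for _ in range(pop_size - n_elites)
        ]
    set_genome(cell, best[1])
    return best[0]
```

In this setting, a fitness_fn would typically feed the cell a sequence whose early elements must be recalled after a long stretch of distractors and return the recall accuracy; because the loop needs only scalar fitness values, it sidesteps backpropagation through the memory entirely.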