
Temporal-kernel recurrent neural networks.

Affiliations

Department of Computer Science, University of Toronto, Toronto, Canada.

Publication Information

Neural Netw. 2010 Mar;23(2):239-43. doi: 10.1016/j.neunet.2009.10.009. Epub 2009 Nov 5.

Abstract

A Recurrent Neural Network (RNN) is a powerful connectionist model that can be applied to many challenging sequential problems, including problems that naturally arise in language and speech. However, RNNs are extremely hard to train on problems that have long-term dependencies, where it is necessary to remember events for many timesteps before using them to make a prediction. In this paper we consider the problem of training RNNs to predict sequences that exhibit significant long-term dependencies, focusing on a serial recall task where the RNN needs to remember a sequence of characters for a large number of steps before reconstructing it. We introduce the Temporal-Kernel Recurrent Neural Network (TKRNN), which is a variant of the RNN that can cope with long-term dependencies much more easily than a standard RNN, and show that the TKRNN develops short-term memory that successfully solves the serial recall task by representing the input string with a stable state of its hidden units.
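The abstract does not give the update equations, but the name suggests that each recurrent connection sees a temporally smoothed history of activity rather than only the previous timestep. Below is a minimal illustrative sketch of one such mechanism, assuming an exponentially decaying trace that can be updated recursively; the decay rate `lam`, the function name `tkrnn_step`, and all other names are our choices for illustration, not the paper's actual parameterization.

```python
import numpy as np

def tkrnn_step(x_t, h_trace, params, lam=0.9):
    """One step of a toy temporal-kernel recurrent unit (illustrative only).

    Instead of feeding only the previous hidden state into the recurrence,
    each unit sees an exponentially decaying trace of past hidden states:

        trace_t = lam * trace_{t-1} + (1 - lam) * h_t

    Unrolled, trace_t = (1 - lam) * sum_k lam**k * h_{t-k}: a convolution
    of the hidden sequence with an exponential kernel, computed in O(1)
    per step.
    """
    W_in, W_rec, b = params
    # Hidden state driven by the current input and the decaying trace
    # of past activity (not just h_{t-1}).
    h_t = np.tanh(W_in @ x_t + W_rec @ h_trace + b)
    # Leaky update of the trace: recent states count more, but activity
    # from many steps back still leaks through.
    h_trace = lam * h_trace + (1.0 - lam) * h_t
    return h_t, h_trace

# Usage: run a random input sequence through the unit.
rng = np.random.default_rng(0)
n_in, n_hid, T = 4, 8, 50
params = (rng.normal(0.0, 0.1, (n_hid, n_in)),
          rng.normal(0.0, 0.1, (n_hid, n_hid)),
          np.zeros(n_hid))
h_trace = np.zeros(n_hid)
for t in range(T):
    h_t, h_trace = tkrnn_step(rng.normal(size=n_in), h_trace, params)
```

The decay rate controls the effective memory horizon: the closer `lam` is to 1, the more slowly the trace changes, so information from many steps back reaches the current state through a short path. That is the intuition behind the abstract's claim that the TKRNN copes with long-term dependencies more easily than a standard RNN.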
