
Development of low entropy coding in a recurrent network.

Author information

Harpur G F, Prager R W

Affiliation

Engineering Department, University of Cambridge, Trumpington Street, Cambridge, CB2 1PZ, UK.

Publication information

Network. 1996 May;7(2):277-84. doi: 10.1088/0954-898X/7/2/007.

Abstract

In this paper we present an unsupervised neural network which exhibits competition between units via inhibitory feedback. The operation is such as to minimize reconstruction error, both for individual patterns, and over the entire training set. A key difference from networks which perform principal components analysis, or one of its variants, is the ability to converge to non-orthogonal weight values. We discuss the network's operation in relation to the twin goals of maximizing information transfer and minimizing code entropy, and show how the assignment of prior probabilities to network outputs can help to reduce entropy. We present results from two binary coding problems, and from experiments with image coding.
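The abstract describes outputs that compete through inhibitory feedback while the network minimizes reconstruction error over individual patterns and the whole training set. A minimal sketch of that idea, under stated assumptions: the forward weight matrix `A`, the settling rule (gradient descent on the squared residual, where the off-diagonal terms of `A.T @ A` act as mutual inhibition), the residual-driven Hebbian weight update, and all learning rates and iteration counts are illustrative choices, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def settle(A, x, eta=0.1, n_steps=50):
    """Let outputs s converge so as to minimize ||x - A @ s||^2.

    Each step feeds the reconstruction residual back through A.T;
    the cross terms of A.T @ A inhibit units with overlapping weights.
    """
    s = np.zeros(A.shape[1])
    for _ in range(n_steps):
        r = x - A @ s            # reconstruction residual
        s += eta * (A.T @ r)     # residual-driven update with implicit inhibition
    return s

def train(X, n_units, alpha=0.05, n_epochs=20):
    """Learn weights that reduce reconstruction error over the training set."""
    A = rng.normal(scale=0.1, size=(X.shape[1], n_units))
    for _ in range(n_epochs):
        for x in X:
            s = settle(A, x)
            r = x - A @ s
            A += alpha * np.outer(r, s)  # Hebbian update on the residual
    return A

# Toy problem in the spirit of the paper's binary coding experiments:
# inputs are 0/1 mixtures of two (non-negative, hypothetical) basis patterns.
basis = np.array([[1., 1., 0., 0.],
                  [0., 0., 1., 1.]])
X = np.array([c0 * basis[0] + c1 * basis[1]
              for c0 in (0., 1.) for c1 in (0., 1.)])

A = train(X, n_units=2)
err = np.mean([np.sum((x - A @ settle(A, x)) ** 2) for x in X])
```

Note that nothing here forces the learned weight vectors to be orthogonal: the settling dynamics only require that the residual vanish, which is the property the abstract contrasts with PCA-style networks.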

