On the importance of hidden bias and hidden entropy in representational efficiency of the Gaussian-Bipolar Restricted Boltzmann Machines.

Affiliation

College of Engineering, Koç University, Rumelifeneri yolu, Istanbul 34450, Turkey.

Publication Info

Neural Netw. 2018 Sep;105:405-418. doi: 10.1016/j.neunet.2018.06.002. Epub 2018 Jun 22.

Abstract

In this paper, we analyze the role of the hidden bias in the representational efficiency of Gaussian-Bipolar Restricted Boltzmann Machines (GBPRBMs), which are similar to the widely used Gaussian-Bernoulli RBMs. Our experiments show that the hidden bias plays an important role in shaping the probability density function of the visible units. We define the hidden entropy and propose it as a measure of the representational efficiency of the model. Using this measure, we investigate the effect of the hidden bias on the hidden entropy and provide a full analysis of the hidden entropy as a function of the hidden bias for small models with up to three hidden units. We also provide insight into the representational efficiency of larger-scale models. Furthermore, we introduce the Normalized Empirical Hidden Entropy (NEHE) as an alternative to the hidden entropy that can be computed for large models. Experiments on the MNIST, CIFAR-10, and Faces data sets show that NEHE can serve as a measure of representational efficiency and gives insight into the minimum number of hidden units required to represent the data.
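The abstract does not give the formulas for the hidden entropy or for NEHE. The sketch below is only a minimal illustration of how such a quantity might be estimated from data, assuming NEHE behaves like a per-unit empirical Shannon entropy of the bipolar hidden activations, averaged over hidden units and normalized to [0, 1]. The function names, the conditional-activation formula, and the normalization are illustrative assumptions, not the paper's exact definitions.

```python
import numpy as np

def hidden_activation_probs(V, W, c, sigma):
    # P(h_j = +1 | v) for a Gaussian-Bipolar RBM with bipolar hidden units
    # h_j in {-1, +1}. V: (n_samples, n_visible) data, W: (n_visible, n_hidden)
    # weights, c: (n_hidden,) hidden bias, sigma: (n_visible,) visible std devs.
    # The energy convention (visible units scaled by sigma**2) is an assumption.
    pre = (V / sigma**2) @ W + c                 # shape (n_samples, n_hidden)
    return 1.0 / (1.0 + np.exp(-2.0 * pre))      # sigmoid(2x) for bipolar units

def normalized_empirical_hidden_entropy(P, eps=1e-12):
    # Per-unit Shannon entropy of the empirical P(h_j = +1) over the data,
    # averaged over hidden units and divided by log 2 so the value lies in [0, 1].
    # This is an assumed surrogate for the paper's NEHE, not its exact definition.
    p = P.mean(axis=0)
    ent = -(p * np.log(p + eps) + (1.0 - p) * np.log(1.0 - p + eps))
    return float(ent.mean() / np.log(2.0))

# Illustrative usage with random data and random parameters.
rng = np.random.default_rng(0)
V = rng.normal(size=(1000, 784))
W = 0.01 * rng.normal(size=(784, 64))
c = np.zeros(64)
sigma = np.ones(784)
print(normalized_empirical_hidden_entropy(hidden_activation_probs(V, W, c, sigma)))
```

In this surrogate, driving the hidden bias far from zero saturates the activation probabilities and pushes the per-unit entropy toward zero, which is in the spirit of the bias-entropy relationship the abstract describes, though the paper's own quantity may be defined differently.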

