
Deterministic Versus Nondeterministic Optimization Algorithms for the Restricted Boltzmann Machine.

Author information

Zeng Gengsheng L

Affiliation

Department of Computer Science, Utah Valley University, USA.

Publication information

J Comput Cogn Eng. 2024 Nov 22;3(4):404-411. doi: 10.47852/bonviewjcce42022789. Epub 2024 May 23.

Abstract

A restricted Boltzmann machine is a fully connected shallow neural network that can be used to solve many challenging optimization problems. Boltzmann machines are usually treated as probability models, and probability models normally use nondeterministic algorithms to estimate their parameters. The Hopfield network, also known as the Ising model, is a special case of the Boltzmann machine in which the hidden layer coincides with the visible layer: the weights and biases from the visible layer to the hidden layer are the same as those from the hidden layer to the visible layer. When the Hopfield network is treated as a probabilistic model, everything is stochastic (i.e., random) and nondeterministic, and an optimization problem in the network is formulated as searching for samples that have higher probabilities under a probability density function. This paper proposes a method that treats the Hopfield network as a deterministic model, in which nothing is random and no stochastic distribution is used. An optimization problem associated with the Hopfield network then has a deterministic objective function (also known as a loss function or cost function), which is the energy function itself; the purpose of the objective function is to guide the Hopfield network toward a state of lower energy. This study suggests that deterministic optimization algorithms can be used for the associated optimization problems. The deterministic algorithm has the same mathematical form as the calculation in a perceptron: a dot product, a bias, and a nonlinear activation function. The paper uses examples of searching for stable states to demonstrate that the deterministic optimization method may have a faster convergence rate and smaller errors.
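The abstract's deterministic view can be illustrated with a standard Hopfield update rule, where each unit applies a perceptron-style computation (dot product plus bias, followed by a sign activation) and the network descends the energy function. The following sketch is a minimal illustration of that general idea, not the paper's exact algorithm; the Hebbian weight construction and the example pattern are assumptions for demonstration.

```python
import numpy as np

def energy(W, b, s):
    # Hopfield energy: E(s) = -1/2 s^T W s - b^T s
    return -0.5 * s @ W @ s - b @ s

def deterministic_update(W, b, s, max_sweeps=100):
    """Asynchronous deterministic updates. Each unit applies a
    perceptron-style rule: dot product + bias + sign activation.
    For symmetric W with zero diagonal, every flip lowers the
    energy, so the network settles into a stable state."""
    s = s.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(s)):
            new_si = 1.0 if W[i] @ s + b[i] >= 0 else -1.0
            if new_si != s[i]:
                s[i] = new_si
                changed = True
        if not changed:  # a full sweep with no flips = stable state
            break
    return s

# Toy example: store one pattern with a Hebbian outer product
p = np.array([1, -1, 1, -1], dtype=float)
W = np.outer(p, p)
np.fill_diagonal(W, 0.0)   # no self-connections
b = np.zeros(4)

s0 = np.array([1, 1, 1, -1], dtype=float)  # noisy starting state
s = deterministic_update(W, b, s0)          # recovers the stored pattern
```

Because no sampling is involved, every run from the same starting state follows the same trajectory and reaches the same stable state, which is the sense in which the optimization is deterministic.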


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/e79d/11634054/d74bb7aa4eeb/nihms-2021810-f0001.jpg
