

Adaptiveness in monotone pseudo-Boolean optimization and stochastic neural computation.

Author information

Grossi Giuliano

Affiliation

Dipartimento di Scienze dell'Informazione, Università degli Studi di Milano, Via Comelico 39, Milano I-20135, Italy.

Publication information

Int J Neural Syst. 2009 Aug;19(4):241-52. doi: 10.1142/S0129065709001999.

Abstract

The Hopfield neural network (HNN) is a nonlinear computational model that has been successfully applied to finding near-optimal solutions of several difficult combinatorial problems. In many cases, the network energy function is obtained through a learning procedure so that its minima are states falling into a proper subspace (feasible region) of the search space. However, because of the network nonlinearity, a number of undesirable local energy minima emerge from the learning procedure, significantly affecting the network performance. In the neural model analyzed here, we combine a penalty strategy and a stochastic process in order to enhance the performance of a binary HNN. The penalty strategy allows us to gradually lead the search towards states representing feasible solutions, thus avoiding oscillatory behavior or asymptotically unstable convergence. The presence of stochastic dynamics potentially prevents the network from falling into shallow local minima of the energy function, i.e., minima quite far from the global optimum. Hence, for a given fixed network topology, the desired final distribution over the states can be reached by carefully modulating this process. The model uses pseudo-Boolean functions to express both the problem constraints and the cost function; a combination of these two functions is then interpreted as the energy of the neural network. A wide variety of NP-hard problems fall into the class of problems that can be solved by the model at hand, particularly those having a monotone quadratic pseudo-Boolean function as constraint function, that is, a function easily derived from a closed algebraic expression representing the constraint structure and easy (polynomial-time) to maximize.
We show the asymptotic convergence properties of this model, characterizing its state-space distribution at thermal equilibrium in terms of a Markov chain, and give evidence of its ability to find high-quality solutions on benchmarks and randomly generated instances of two specific problems taken from computational graph theory.

