Minimum relative entropy distributions with a large mean are Gaussian.

Author Information

Smerlak Matteo

Affiliation

Perimeter Institute for Theoretical Physics, 31 Caroline Street N., Waterloo, Ontario N2L 2Y5, Canada.

Publication Information

Phys Rev E. 2016 Dec;94(6-1):062107. doi: 10.1103/PhysRevE.94.062107. Epub 2016 Dec 5.

Abstract

Entropy optimization principles are versatile tools with wide-ranging applications from statistical physics to engineering to ecology. Here we consider the following constrained problem: Given a prior probability distribution q, find the posterior distribution p minimizing the relative entropy (also known as the Kullback-Leibler divergence) with respect to q under the constraint that mean(p) is fixed and large. We show that solutions to this problem are approximately Gaussian. We discuss two applications of this result. In the context of dissipative dynamics, the equilibrium distribution of a Brownian particle confined in a strong external field is independent of the shape of the confining potential. We also derive an H-type theorem for evolutionary dynamics: The entropy of the (standardized) distribution of fitness of a population evolving under natural selection is eventually increasing in time.
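By the standard Lagrange-multiplier argument, the minimizer of the relative entropy under a mean constraint is an exponential tilt of the prior, p(x) ∝ q(x)e^{λx}; the paper's result says this tilted distribution is approximately Gaussian when the constrained mean is large. A minimal numerical sketch of that behavior, assuming a hypothetical light-tailed prior q(x) ∝ exp(-x⁴) and an illustrative tilt parameter λ = 2000 (both choices are ours, not from the paper):

```python
import numpy as np

# Grid and a non-Gaussian prior q(x) ∝ exp(-x^4) (illustrative choice)
x = np.linspace(-5.0, 30.0, 20001)
dx = x[1] - x[0]
log_q = -x**4

# Exponential tilt p(x) ∝ q(x) exp(λx): the minimum-relative-entropy
# solution for a fixed-mean constraint; large λ forces a large mean.
lmbda = 2000.0
log_p = log_q + lmbda * x
log_p -= log_p.max()          # stabilize before exponentiating
p = np.exp(log_p)
p /= p.sum() * dx             # normalize on the grid

# Moments of the tilted distribution
mean = (p * x).sum() * dx
var = (p * (x - mean) ** 2).sum() * dx
# Excess kurtosis is 0 for an exact Gaussian; here it should be tiny.
excess_kurt = (p * (x - mean) ** 4).sum() * dx / var**2 - 3.0
```

With this tilt the mean sits near 8 while the excess kurtosis is close to zero, consistent with the tilted distribution being nearly Gaussian in the large-mean regime.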
