Neural Network Renormalization Group.

Affiliations

Institute of Physics, Chinese Academy of Sciences, Beijing 100190, China.

University of Chinese Academy of Sciences, Beijing 100049, China.

Publication Information

Phys Rev Lett. 2018 Dec 28;121(26):260601. doi: 10.1103/PhysRevLett.121.260601.

Abstract

We present a variational renormalization group (RG) approach based on a reversible generative model with hierarchical architecture. The model performs hierarchical change-of-variables transformations from the physical space to a latent space with reduced mutual information. Conversely, the neural network directly maps independent Gaussian noises to physical configurations following the inverse RG flow. The model has an exact and tractable likelihood, which allows unbiased training and direct access to the renormalized energy function of the latent variables. To train the model, we employ probability density distillation for the bare energy function of the physical problem, in which the training loss provides a variational upper bound of the physical free energy. We demonstrate practical usage of the approach by identifying mutually independent collective variables of the Ising model and performing accelerated hybrid Monte Carlo sampling in the latent space. Lastly, we comment on the connection of the present approach to the wavelet formulation of RG and the modern pursuit of information preserving RG.
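The training objective described above can be written out. Since the flow x = g(z) is bijective with exact likelihood q(x) = p(z) |det(∂g/∂z)|⁻¹, the distillation loss L = E_{x~q}[ln q(x) + E(x)] = KL(q ‖ e^{−E}/Z) − ln Z ≥ −ln Z, i.e. a variational upper bound of the free energy (taking β = 1). Below is a minimal runnable sketch of this objective, assuming a RealNVP-style affine coupling flow and a toy quartic energy in place of the paper's hierarchical architecture and Ising setup; the class names, the energy, and all hyperparameters are illustrative choices, not taken from the paper.

```python
# Minimal sketch of the probability density distillation objective
# (illustrative, not the paper's hierarchical architecture): a coupling
# flow x = g(z) with exact log-likelihood, trained against a bare energy.
import math
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """One affine coupling layer: invertible, with tractable log|det J|."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.Tanh(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, z):
        z1, z2 = z[:, :self.half], z[:, self.half:]
        s, t = self.net(z1).chunk(2, dim=1)
        x2 = z2 * torch.exp(s) + t            # elementwise affine map
        return torch.cat([z1, x2], dim=1), s.sum(dim=1)  # x, log|det(dx/dz)|

def bare_energy(x):
    # Toy quartic energy standing in for the physical problem (the paper
    # treats the Ising model; this placeholder is our choice).
    return (0.5 * x**2 + 0.1 * x**4).sum(dim=1)

dim, batch = 4, 256
layers = nn.ModuleList([AffineCoupling(dim) for _ in range(4)])
opt = torch.optim.Adam(layers.parameters(), lr=1e-3)

for step in range(1000):
    z = torch.randn(batch, dim)               # independent Gaussian noise
    log_prior = -0.5 * (z**2).sum(dim=1) - 0.5 * dim * math.log(2 * math.pi)
    x, total_logdet = z, torch.zeros(batch)
    for layer in layers:
        x, logdet = layer(x)
        total_logdet = total_logdet + logdet
        x = x.flip(1)  # permute halves between layers (|det| = 1)
    log_q = log_prior - total_logdet          # exact ln q(x) by change of variables
    # E_q[ln q(x) + E(x)] = KL(q || e^{-E}/Z) - ln Z >= -ln Z: a variational
    # upper bound of the free energy, minimized by gradient descent.
    loss = (log_q + bare_energy(x)).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```

After training, sampling amounts to drawing Gaussian noise and applying the learned bijection, which is what makes the latent-space hybrid Monte Carlo mentioned in the abstract cheap.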
