
Transfer learning for chemically accurate interatomic neural network potentials.

Affiliations

Faculty of Chemistry, Institute for Theoretical Chemistry, University of Stuttgart, Germany.

Faculty of Mathematics and Physics, Institute for Stochastics and Applications, University of Stuttgart, Germany.

Publication

Phys Chem Chem Phys. 2023 Feb 15;25(7):5383-5396. doi: 10.1039/d2cp05793j.

Abstract

Developing machine learning-based interatomic potentials from electronic structure methods remains a challenging task for computational chemistry and materials science. This work studies the capability of transfer learning, in particular discriminative fine-tuning, for efficiently generating chemically accurate interatomic neural network potentials on organic molecules from the MD17 and ANI data sets. We show that pre-training the network parameters on data obtained from density functional calculations considerably improves the sample efficiency of models trained on more accurate data. Additionally, we show that fine-tuning with energy labels alone can suffice to obtain accurate atomic forces and run large-scale atomistic simulations, provided a well-designed fine-tuning data set. We also investigate possible limitations of transfer learning, especially regarding the design and size of the pre-training and fine-tuning data sets. Finally, we provide GM-NN potentials pre-trained and fine-tuned on the ANI-1x and ANI-1ccx data sets, which can easily be fine-tuned on and applied to organic molecules.
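The discriminative fine-tuning mentioned above assigns layer-wise learning rates so that pre-trained low-level representations change less than the output layers during fine-tuning. The sketch below illustrates the idea with a hypothetical toy potential in PyTorch; the network, layer names, and learning-rate schedule are illustrative assumptions, not the actual GM-NN architecture or training setup. The fine-tuning step uses energy labels only, mirroring the energy-only fine-tuning studied in the paper.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for a pre-trained interatomic NN potential
# (the real GM-NN model differs; this only illustrates the layer structure).
class ToyPotential(nn.Module):
    def __init__(self, n_features=8):
        super().__init__()
        self.embedding = nn.Linear(n_features, 16)  # "early" layer
        self.hidden = nn.Linear(16, 16)             # "middle" layer
        self.readout = nn.Linear(16, 1)             # per-atom energy head

    def forward(self, x):
        h = torch.tanh(self.embedding(x))
        h = torch.tanh(self.hidden(h))
        return self.readout(h).sum(dim=-2)  # sum atomic contributions -> total energy

model = ToyPotential()
# In practice, weights pre-trained on DFT-level data would be loaded here.

# Discriminative fine-tuning: geometrically decaying learning rates toward
# the input, so pre-trained low-level features are perturbed least.
base_lr, decay = 1e-3, 0.1
optimizer = torch.optim.Adam([
    {"params": model.readout.parameters(),   "lr": base_lr},
    {"params": model.hidden.parameters(),    "lr": base_lr * decay},
    {"params": model.embedding.parameters(), "lr": base_lr * decay**2},
])

# One fine-tuning step on energy labels alone (forces omitted for brevity).
x = torch.randn(4, 5, 8)   # 4 structures, 5 atoms, 8 features per atom
e_ref = torch.randn(4, 1)  # reference (e.g. coupled-cluster) energies
loss = nn.functional.mse_loss(model(x), e_ref)
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Atomic forces would follow as negative gradients of the predicted energy with respect to atomic positions, which is why accurate energies on a well-designed fine-tuning set can suffice.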

