Xie Fankai, Lu Tenglong, Meng Sheng, Liu Miao
Beijing National Laboratory for Condensed Matter Physics, Institute of Physics, Chinese Academy of Sciences, Beijing 100190, China; Songshan Lake Materials Laboratory, Dongguan 523808, China.
Sci Bull (Beijing). 2024 Nov 30;69(22):3525-3532. doi: 10.1016/j.scib.2024.08.039. Epub 2024 Sep 1.
This study introduces a novel artificial intelligence (AI) force field, namely a graph-based pre-trained transformer force field (GPTFF), which can simulate arbitrary inorganic systems with good precision and generalizability. Harnessing a large trove of data and the attention mechanism of transformer algorithms, the model accurately predicts energy, atomic force, and stress with mean absolute error (MAE) values of 32 meV/atom, 71 meV/Å, and 0.365 GPa, respectively. The dataset used to train the model includes 37.8 million single-point energies, 11.7 billion force pairs, and 340.2 million stresses. We also demonstrate that GPTFF can be universally applied to simulate various physical processes, such as crystal structure optimization, phase transitions, and mass transport. The model is publicly released with this paper, enabling anyone to use it immediately without needing to train it.