Cui Taoyong, Tang Chenyu, Zhou Dongzhan, Li Yuqiang, Gong Xingao, Ouyang Wanli, Su Mao, Zhang Shufei
Shanghai Artificial Intelligence Laboratory, Shanghai, China.
The Chinese University of Hong Kong, Hong Kong, China.
Nat Commun. 2025 Feb 22;16(1):1891. doi: 10.1038/s41467-025-57101-4.
Machine learning interatomic potentials (MLIPs) enable more efficient molecular dynamics (MD) simulations with ab initio accuracy and have been applied across various domains of physical science. However, distribution shift between training and test data degrades the test performance of MLIPs and can even cause MD simulations to collapse. In this work, we propose an online Test-time Adaptation Interatomic Potential (TAIP) framework to improve generalization on test data. Specifically, we design a dual-level self-supervised learning approach that leverages global structure and atomic local environment information to align the model with the test data. Extensive experiments demonstrate TAIP's capability to bridge the domain gap between training and test datasets without additional data. TAIP enhances test performance on various benchmarks, from small-molecule datasets to complex periodic molecular systems containing various types of elements. TAIP also enables stable MD simulations where the corresponding baseline models collapse.
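The online adaptation loop the abstract describes can be sketched in miniature. This is an illustrative stand-in, not the authors' TAIP implementation: TAIP's dual-level objectives (global structure and atomic local-environment information) are replaced here by a single hypothetical self-supervised denoising task on a per-structure descriptor vector, and the model is a toy element-wise weight vector.

```python
import numpy as np

# A minimal sketch of online test-time adaptation, NOT the authors' TAIP
# implementation. TAIP's dual-level objectives are stood in for by one
# hypothetical self-supervised denoising task on a descriptor vector x.

def self_supervised_loss(w, x, noise_scale=0.1, rng=None):
    """Denoising objective: element-wise weights w should map a noised copy
    of the descriptor x back to x. The loss is label-free, so it can be
    evaluated on test structures without reference energies or forces."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x_noisy = x + noise_scale * rng.standard_normal(x.shape)
    return float(((w * x_noisy - x) ** 2).mean())

def adapt(w, x, lr=0.05, steps=20):
    """Online adaptation: a few gradient steps on the self-supervised loss
    for each incoming test sample, before the adapted model makes its
    prediction. Finite-difference gradients keep the sketch dependency-free;
    a fixed seed makes the noise identical across loss evaluations."""
    w = w.astype(float).copy()
    eps = 1e-5
    for _ in range(steps):
        grad = np.zeros_like(w)
        for i in range(len(w)):
            wp, wm = w.copy(), w.copy()
            wp[i] += eps
            wm[i] -= eps
            lp = self_supervised_loss(wp, x, rng=np.random.default_rng(1))
            lm = self_supervised_loss(wm, x, rng=np.random.default_rng(1))
            grad[i] = (lp - lm) / (2 * eps)
        w -= lr * grad
    return w
```

In TAIP itself, the adapted network would then predict energies and forces for the next MD step; the point of the sketch is only the label-free update loop that aligns the model with each incoming test sample.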