Yang Qi, Jiang Gui-Duo, He Sheng-Gui
State Key Laboratory for Structural Chemistry of Unstable and Stable Species, Institute of Chemistry, Chinese Academy of Sciences, Beijing 100190, PR China.
University of Chinese Academy of Sciences, Beijing 100049, PR China.
J Chem Theory Comput. 2023 Mar 28;19(6):1922-1930. doi: 10.1021/acs.jctc.2c00923. Epub 2023 Mar 14.
The global optimization of metal cluster structures is an important research field. The traditional deep neural network (T-DNN) global optimization method is a good way to find the global minimum (GM) of metal cluster structures, but it requires a large number of samples. We developed a new global optimization method that combines the DNN with transfer learning (DNN-TL). The DNN-TL method transfers the DNN parameters of the small-sized cluster to the DNN of the large-sized cluster, greatly reducing the number of samples. For the global optimization of the Ptn clusters in this research, the T-DNN method requires about 3-10 times more samples than the DNN-TL method, and the DNN-TL method saves about 70-80% of the computation time. We also found that the average amplitude of parameter changes in the T-DNN training is about 2 times larger than that in the DNN-TL training, which rationalizes the effectiveness of transfer learning. The average fitting errors of the DNN trained by the DNN-TL method can be even smaller than those by the T-DNN method because of the reliability of transfer learning. Finally, we successfully obtained the GM structures of Ptn (n = 8-14) clusters by the DNN-TL method.
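The core idea of the DNN-TL scheme described above is that most trained parameters of the small-cluster DNN are reused to initialize the large-cluster DNN, so only a reduced sample set is needed to refine the larger model. The paper does not give implementation details here, so the following is only a minimal NumPy sketch of that parameter-transfer step under assumed conditions: a plain fully connected network, with all function names (`init_mlp`, `transfer_parameters`) and layer sizes hypothetical. Layers whose weight shapes match (the shared hidden layers) are copied over; layers whose shapes differ (e.g. the input layer, whose width grows with the cluster size) keep a fresh random initialization and would be learned from the new, smaller sample set.

```python
import numpy as np

def init_mlp(layer_sizes, rng):
    """Randomly initialize (weight, bias) pairs for a fully connected net.
    layer_sizes is hypothetical, e.g. [n_features, 32, 32, 1] for an
    energy-fitting network."""
    return [
        (rng.standard_normal((m, n)) * 0.1, np.zeros(n))
        for m, n in zip(layer_sizes[:-1], layer_sizes[1:])
    ]

def transfer_parameters(src, dst):
    """Copy every layer whose weight shape matches from the source
    (small-cluster) network into the destination (large-cluster) network;
    shape-mismatched layers keep their fresh random initialization."""
    out = []
    for (w_s, b_s), (w_d, b_d) in zip(src, dst):
        if w_s.shape == w_d.shape:
            out.append((w_s.copy(), b_s.copy()))   # transferred layer
        else:
            out.append((w_d, b_d))                 # retrained from scratch
    return out

rng = np.random.default_rng(0)
# Pretend the small-cluster net was already trained (sizes are illustrative).
small_net = init_mlp([16, 32, 32, 1], rng)
# Larger cluster -> wider input descriptor, so the first layer differs.
large_net = init_mlp([24, 32, 32, 1], rng)
large_net = transfer_parameters(small_net, large_net)
```

After this transfer, fine-tuning the large-cluster network starts from parameters that already encode the small cluster's potential-energy landscape, which is consistent with the paper's observation that parameter changes during DNN-TL training are smaller than in T-DNN training.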