Sadman Sadeed Omee, Steph-Yves Louis, Nihang Fu, Lai Wei, Sourin Dey, Rongzhi Dong, Qinyang Li, Jianjun Hu
Department of Computer Science and Engineering, University of South Carolina, Columbia, SC 29201, USA.
Patterns (N Y). 2022 Apr 27;3(5):100491. doi: 10.1016/j.patter.2022.100491. eCollection 2022 May 13.
Machine-learning-based materials property prediction models have emerged as a promising approach for new materials discovery, among which graph neural networks (GNNs) have shown the best performance due to their capability to learn high-level features from crystal structures. However, existing GNN models suffer from a lack of scalability, high hyperparameter tuning complexity, and constrained performance due to over-smoothing. We propose DeeperGATGNN, a scalable global graph attention neural network model with differentiable group normalization (DGN) and skip connections, for high-performance materials property prediction. Our systematic benchmark studies show that our model achieves state-of-the-art prediction results on five out of six datasets, outperforming five existing GNN models by up to 10%. Our model is also the most scalable in terms of the number of graph convolution layers, which allows us to train very deep networks (e.g., >30 layers) without significant performance degradation. Our implementation is available at https://github.com/usccolumbia/deeperGATGNN.
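The abstract attributes the model's depth scalability to combining graph attention convolutions with differentiable group normalization and skip connections. The following is a minimal sketch of one such residual block in PyTorch Geometric, not the authors' exact implementation; the hidden width, head count, number of DGN groups, and the softplus activation are illustrative assumptions.

```python
# Sketch of a DeeperGATGNN-style block: graph attention convolution,
# differentiable group normalization (DGN), and a residual skip
# connection -- the combination the abstract credits with mitigating
# over-smoothing in deep GNN stacks.
import torch
import torch.nn.functional as F
from torch_geometric.nn import GATConv
from torch_geometric.nn.norm import DiffGroupNorm


class ResidualGATBlock(torch.nn.Module):
    def __init__(self, dim: int, heads: int = 4, groups: int = 10):
        super().__init__()
        # concat=False averages the attention heads, so the output
        # width matches the input and a plain additive skip works.
        self.conv = GATConv(dim, dim, heads=heads, concat=False)
        # DGN softly clusters nodes into groups and normalizes each
        # group separately, keeping node embeddings distinguishable
        # as depth grows (the over-smoothing countermeasure).
        self.norm = DiffGroupNorm(dim, groups=groups)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        h = self.conv(x, edge_index)
        h = self.norm(h)
        # Skip connection: very deep stacks (e.g., >30 blocks) stay
        # trainable because gradients can bypass each convolution.
        return x + F.softplus(h)
```

Stacking many such blocks and following them with a global attention readout over all atom (node) embeddings would yield a crystal-level representation for property regression, in the spirit of the global-attention design the abstract describes.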