

Lightweight and effective tensor sensitivity for atomistic neural networks.

Affiliations

Theoretical Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA.

Center for Nonlinear Studies, Los Alamos National Laboratory, Los Alamos, New Mexico 87545, USA.

Publication

J Chem Phys. 2023 May 14;158(18). doi: 10.1063/5.0142127.

DOI: 10.1063/5.0142127
PMID: 37158328
Abstract

Atomistic machine learning focuses on the creation of models that obey fundamental symmetries of atomistic configurations, such as permutation, translation, and rotation invariances. In many of these schemes, translation and rotation invariance are achieved by building on scalar invariants, e.g., distances between atom pairs. There is growing interest in molecular representations that work internally with higher rank rotational tensors, e.g., vector displacements between atoms, and tensor products thereof. Here, we present a framework for extending the Hierarchically Interacting Particle Neural Network (HIP-NN) with Tensor Sensitivity information (HIP-NN-TS) from each local atomic environment. Crucially, the method employs a weight tying strategy that allows direct incorporation of many-body information while adding very few model parameters. We show that HIP-NN-TS is more accurate than HIP-NN, with negligible increase in parameter count, for several datasets and network sizes. As the dataset becomes more complex, tensor sensitivities provide greater improvements to model accuracy. In particular, HIP-NN-TS achieves a record mean absolute error of 0.927 kcal/mol for conformational energy variation on the challenging COMP6 benchmark, which includes a broad set of organic molecules. We also compare the computational performance of HIP-NN-TS to HIP-NN and other models in the literature.
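The distinction the abstract draws — scalar invariants such as pairwise distances versus higher-rank tensors (outer products of displacement vectors) that are contracted back down to rotational invariants — can be illustrated with a small NumPy sketch. This is an illustrative toy under assumed conventions, not the HIP-NN-TS implementation; the function names (`pair_distances`, `rank2_invariant`) are invented for this example:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_rotation():
    # Orthogonalize a random matrix via QR, then fix the sign of the
    # determinant so we get a proper rotation (det = +1).
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(q) < 0:
        q[:, 0] = -q[:, 0]
    return q

def pair_distances(pos):
    # Rank-0 (scalar) features: sorted pairwise distances, which are
    # invariant under rotation and translation by construction.
    diff = pos[:, None, :] - pos[None, :, :]
    d = np.linalg.norm(diff, axis=-1)
    i, j = np.triu_indices(len(pos), k=1)
    return np.sort(d[i, j])

def rank2_invariant(pos):
    # Rank-2 route: sum the outer products r_ij (x) r_ij of displacement
    # vectors, then fully contract (Frobenius norm) back to a scalar.
    # Under rotation R, T -> R T R^T, so the contraction is invariant.
    diff = pos[:, None, :] - pos[None, :, :]
    T = np.einsum("ija,ijb->ab", diff, diff)
    return np.linalg.norm(T)

pos = rng.normal(size=(5, 3))
R = random_rotation()
moved = pos @ R.T + np.array([1.0, -2.0, 0.5])  # rotate, then translate

assert np.allclose(pair_distances(pos), pair_distances(moved))
assert np.isclose(rank2_invariant(pos), rank2_invariant(moved))
```

The point of working internally with the rank-2 tensor rather than only distances is that it retains directional (many-body) information through the network before being contracted to an invariant output, which is the kind of information the tensor sensitivities in the paper exploit.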


Similar articles

1. Lightweight and effective tensor sensitivity for atomistic neural networks. J Chem Phys. 2023 May 14;158(18). doi: 10.1063/5.0142127.
2. Transferable Dynamic Molecular Charge Assignment Using Deep Neural Networks. J Chem Theory Comput. 2018 Sep 11;14(9):4687-4698. doi: 10.1021/acs.jctc.8b00524. Epub 2018 Aug 17.
3. Hierarchical modeling of molecular energies using a deep neural network. J Chem Phys. 2018 Jun 28;148(24):241715. doi: 10.1063/1.5011181.
4. Fitting potential energy surfaces with fundamental invariant neural network. II. Generating fundamental invariants for molecular systems with up to ten atoms. J Chem Phys. 2020 May 29;152(20):204307. doi: 10.1063/5.0010104.
5. Tensor-Factorized Neural Networks. IEEE Trans Neural Netw Learn Syst. 2018 May;29(5):1998-2011. doi: 10.1109/TNNLS.2017.2690379. Epub 2017 Apr 17.
6. Symmetry-Adapted Machine Learning for Tensorial Properties of Atomistic Systems. Phys Rev Lett. 2018 Jan 19;120(3):036002. doi: 10.1103/PhysRevLett.120.036002.
7. Communication: Fitting potential energy surfaces with fundamental invariant neural network. J Chem Phys. 2016 Aug 21;145(7):071101. doi: 10.1063/1.4961454.
8. Machine Learning Models of Survival Prediction in Trauma Patients. J Clin Med. 2019 Jun 5;8(6):799. doi: 10.3390/jcm8060799.
9. SchNetPack: A Deep Learning Toolbox For Atomistic Systems. J Chem Theory Comput. 2019 Jan 8;15(1):448-455. doi: 10.1021/acs.jctc.8b00908. Epub 2018 Dec 10.
10. A critical comparison of neural network potentials for molecular reaction dynamics with exact permutation symmetry. Phys Chem Chem Phys. 2019 May 15;21(19):9672-9682. doi: 10.1039/c8cp06919k.

Cited by

1. Including Physics-Informed Atomization Constraints in Neural Networks for Reactive Chemistry. J Chem Inf Model. 2025 May 12;65(9):4367-4380. doi: 10.1021/acs.jcim.5c00341. Epub 2025 Apr 29.
2. MLTB: Enhancing Transferability and Extensibility of Density Functional Tight-Binding Theory with Many-body Interaction Corrections. J Chem Theory Comput. 2025 Feb 11;21(3):1089-1097. doi: 10.1021/acs.jctc.4c00858. Epub 2025 Jan 28.
3. Thermodynamic Transferability in Coarse-Grained Force Fields Using Graph Neural Networks. J Chem Theory Comput. 2024 Dec 10;20(23):10524-10539. doi: 10.1021/acs.jctc.4c00788. Epub 2024 Nov 23.
4. Data Generation for Machine Learning Interatomic Potentials and Beyond. Chem Rev. 2024 Dec 25;124(24):13681-13714. doi: 10.1021/acs.chemrev.4c00572. Epub 2024 Nov 21.
5. Exploring the frontiers of condensed-phase chemistry with a general reactive machine learning potential. Nat Chem. 2024 May;16(5):727-734. doi: 10.1038/s41557-023-01427-3. Epub 2024 Mar 7.