Neural Network with Optimal Neuron Activation Functions Based on Additive Gaussian Process Regression

Authors

Manzhos Sergei, Ihara Manabu

Affiliation

School of Materials and Chemical Technology, Tokyo Institute of Technology, Ookayama 2-12-1, Meguro-ku, Tokyo 152-8552, Japan.

Publication

J Phys Chem A. 2023 Sep 21;127(37):7823-7835. doi: 10.1021/acs.jpca.3c02949. Epub 2023 Sep 12.

DOI: 10.1021/acs.jpca.3c02949
PMID: 37698519
Abstract

Feed-forward neural networks (NNs) are a staple machine learning method widely used in many areas of science and technology, including physical chemistry, computational chemistry, and materials informatics. While even a single-hidden-layer NN is a universal approximator, its expressive power is limited by the use of simple neuron activation functions (such as sigmoid functions) that are typically the same for all neurons. More flexible neuron activation functions would allow the use of fewer neurons and layers and thereby save computational cost and improve expressive power. We show that additive Gaussian process regression (GPR) can be used to construct optimal neuron activation functions that are individual to each neuron. An approach is also introduced that avoids nonlinear fitting of neural network parameters by defining them with rules. The resulting method combines the advantage of robustness of a linear regression with the higher expressive power of an NN. We demonstrate the approach by fitting the potential energy surfaces of the water molecule and formaldehyde. Without requiring any nonlinear optimization, the additive-GPR-based approach outperforms a conventional NN in the high-accuracy regime, where a conventional NN suffers more from overfitting.
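The construction described above can be illustrated with a minimal sketch: a single-hidden-layer additive model whose first-layer weights and biases are fixed by a simple rule (random projections here, standing in for the paper's rules) and whose per-neuron 1D "activation functions" are expanded in a Gaussian basis, so the entire fit reduces to one linear least-squares problem with no nonlinear optimization. The toy target function, basis widths, centers, and regularization below are illustrative assumptions, not the paper's actual setup or data.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(X):
    # Toy 2D surface standing in for a potential energy surface.
    return np.sin(X[:, 0]) * np.cos(X[:, 1]) + 0.5 * X[:, 0] ** 2

X_train = rng.uniform(-2.0, 2.0, size=(400, 2))
y_train = target(X_train)
X_test = rng.uniform(-2.0, 2.0, size=(200, 2))
y_test = target(X_test)

# Rule-based first layer: weights/biases are fixed, never optimized.
n_neurons, n_centers = 16, 20
W = rng.normal(size=(2, n_neurons))
b = rng.normal(size=n_neurons)
centers = np.linspace(-6.0, 6.0, n_centers)

def features(X):
    Z = X @ W + b  # one 1D projected coordinate per neuron
    # Gaussian basis on each neuron's coordinate; flattening makes the
    # additive model linear in all of its remaining coefficients.
    Phi = np.exp(-0.5 * (Z[:, :, None] - centers) ** 2)
    return Phi.reshape(len(X), -1)

# One regularized linear solve replaces nonlinear NN training.
A = features(X_train)
lam = 1e-4
coef = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ y_train)

pred = features(X_test) @ coef
rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
print(f"test RMSE: {rmse:.4f} (target std: {float(y_test.std()):.4f})")
```

Because the only fitted quantities enter linearly, the method inherits the robustness of linear regression while each neuron still ends up with its own flexible 1D response, which is the trade-off the abstract highlights.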


Similar Articles

1
Neural Network with Optimal Neuron Activation Functions Based on Additive Gaussian Process Regression.
J Phys Chem A. 2023 Sep 21;127(37):7823-7835. doi: 10.1021/acs.jpca.3c02949. Epub 2023 Sep 12.
2
The deep arbitrary polynomial chaos neural network or how Deep Artificial Neural Networks could benefit from data-driven homogeneous chaos theory.
Neural Netw. 2023 Sep;166:85-104. doi: 10.1016/j.neunet.2023.06.036. Epub 2023 Jul 10.
3
Representing globally accurate reactive potential energy surfaces with complex topography by combining Gaussian process regression and neural networks.
Phys Chem Chem Phys. 2022 Jun 1;24(21):12827-12836. doi: 10.1039/d2cp00719c.
4
Neural optimization machine: a neural network approach for optimization and its application in additive manufacturing with physics-guided learning.
Philos Trans A Math Phys Eng Sci. 2023 Nov 13;381(2260):20220405. doi: 10.1098/rsta.2022.0405. Epub 2023 Sep 25.
5
Parametrization of analytic interatomic potential functions using neural networks.
J Chem Phys. 2008 Jul 28;129(4):044111. doi: 10.1063/1.2957490.
6
Single-hidden-layer feed-forward quantum neural network based on Grover learning.
Neural Netw. 2013 Sep;45:144-50. doi: 10.1016/j.neunet.2013.02.012. Epub 2013 Mar 14.
7
The loss of the property of locality of the kernel in high-dimensional Gaussian process regression on the example of the fitting of molecular potential energy surfaces.
J Chem Phys. 2023 Jan 28;158(4):044111. doi: 10.1063/5.0136156.
8
Computational Investigation of the Potential and Limitations of Machine Learning with Neural Network Circuits Based on Synaptic Transistors.
J Phys Chem Lett. 2024 Jul 11;15(27):6974-6985. doi: 10.1021/acs.jpclett.4c01413. Epub 2024 Jun 28.
9
SNN: Time step reduction of spiking surrogate gradients for training energy efficient single-step spiking neural networks.
Neural Netw. 2023 Feb;159:208-219. doi: 10.1016/j.neunet.2022.12.008. Epub 2022 Dec 19.
10
Stereopsis by constraint learning feed-forward neural networks.
IEEE Trans Neural Netw. 1993;4(2):332-42. doi: 10.1109/72.207620.

Cited By

1
Deciphering the performance of different surface models for corneal topography.
Ophthalmic Physiol Opt. 2025 Sep;45(6):1270-1281. doi: 10.1111/opo.13539. Epub 2025 Jun 19.
2
MLP Enhanced CO Emission Prediction Model with LWSSA Nature Inspired Optimization.
Sci Rep. 2025 Jan 13;15(1):1891. doi: 10.1038/s41598-025-85709-5.
3
Computational Investigation of the Potential and Limitations of Machine Learning with Neural Network Circuits Based on Synaptic Transistors.
J Phys Chem Lett. 2024 Jul 11;15(27):6974-6985. doi: 10.1021/acs.jpclett.4c01413. Epub 2024 Jun 28.