

On reaction network implementations of neural networks.

Authors

Anderson David F, Joshi Badal, Deshpande Abhishek

Affiliations

Department of Mathematics, University of Wisconsin-Madison, Madison, WI, USA.

Department of Mathematics, California State University San Marcos, San Marcos, CA, USA.

Publication

J R Soc Interface. 2021 Apr;18(177):20210031. doi: 10.1098/rsif.2021.0031. Epub 2021 Apr 14.

DOI: 10.1098/rsif.2021.0031
PMID: 33849332
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8086923/
Abstract

This paper is concerned with the utilization of deterministically modelled chemical reaction networks for the implementation of (feed-forward) neural networks. We develop a general mathematical framework and prove that the ordinary differential equations (ODEs) associated with certain reaction network implementations of neural networks have desirable properties including (i) existence of unique positive fixed points that are smooth in the parameters of the model (necessary for gradient descent) and (ii) fast convergence to the fixed point regardless of initial condition (necessary for efficient implementation). We do so by first making a connection between neural networks and fixed points for systems of ODEs, and then by constructing reaction networks with the correct associated set of ODEs. We demonstrate the theory by constructing a reaction network that implements a neural network with a smoothed ReLU activation function, though we also demonstrate how to generalize the construction to allow for other activation functions (each with the desirable properties listed previously). As there are multiple types of 'networks' used in this paper, we also give a careful introduction to both reaction networks and neural networks, in order to disambiguate the overlapping vocabulary in the two settings and to clearly highlight the role of each network's properties.
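The core idea in the abstract — a mass-action ODE whose unique positive fixed point computes a smoothed ReLU of its input — can be illustrated with a minimal sketch. The specific reactions, rate constants, dual-rail input encoding, and integration parameters below are illustrative assumptions for exposition, not the paper's exact construction: the reactions 0 → Y (rate ε), X⁺ + Y → X⁺ + 2Y, X⁻ + Y → X⁻, and 2Y → Y give dy/dt = ε + b·y − y², where b = [X⁺] − [X⁻] encodes a signed input with nonnegative concentrations. The unique positive fixed point (b + √(b² + 4ε))/2 approximates max(b, 0) for small ε, and trajectories converge to it from any positive initial condition.

```python
import math

def smoothed_relu_fixed_point(b, eps=1e-6):
    # Unique positive root of eps + b*y - y^2 = 0;
    # approximates relu(b) = max(b, 0) as eps -> 0.
    return (b + math.sqrt(b * b + 4 * eps)) / 2

def simulate(b, eps=1e-6, y0=1.0, dt=1e-3, steps=50_000):
    # Forward-Euler integration of the mass-action ODE
    #   dy/dt = eps + b*y - y^2
    # arising from the (illustrative) reactions
    #   0 -> Y (rate eps), X+ + Y -> X+ + 2Y, X- + Y -> X-, 2Y -> Y
    # with b = [X+] - [X-] held fixed (dual-rail input encoding).
    y = y0
    for _ in range(steps):
        y += dt * (eps + b * y - y * y)
    return y

for b in (2.0, 0.5, -1.0):
    print(f"b={b:+.1f}  fixed point={smoothed_relu_fixed_point(b):.6f}  "
          f"simulated={simulate(b):.6f}  relu(b)={max(b, 0.0):.6f}")
```

For signed weights and multiple layers the paper composes such units into feed-forward networks; the sketch shows only the single-neuron fixed-point mechanism and its convergence from a generic initial condition.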


Figures 1–14 are available with the full text at PMC (PMC8086923).

Similar articles

1. On reaction network implementations of neural networks. J R Soc Interface. 2021 Apr;18(177):20210031. doi: 10.1098/rsif.2021.0031. Epub 2021 Apr 14.
2. Neural networks with ReLU powers need less depth. Neural Netw. 2024 Apr;172:106073. doi: 10.1016/j.neunet.2023.12.027. Epub 2023 Dec 19.
3. Convergence of deep convolutional neural networks. Neural Netw. 2022 Sep;153:553-563. doi: 10.1016/j.neunet.2022.06.031. Epub 2022 Jun 30.
4. Optimal approximation of piecewise smooth functions using deep ReLU neural networks. Neural Netw. 2018 Dec;108:296-330. doi: 10.1016/j.neunet.2018.08.019. Epub 2018 Sep 7.
5. Deep convolutional neural network and IoT technology for healthcare. Digit Health. 2024 Jan 17;10:20552076231220123. doi: 10.1177/20552076231220123. eCollection 2024 Jan-Dec.
6. Towards understanding theoretical advantages of complex-reaction networks. Neural Netw. 2022 Jul;151:80-93. doi: 10.1016/j.neunet.2022.03.024. Epub 2022 Mar 29.
7. Approximation of smooth functionals using deep ReLU networks. Neural Netw. 2023 Sep;166:424-436. doi: 10.1016/j.neunet.2023.07.012. Epub 2023 Jul 18.
8. Improved weight initialization for deep and narrow feedforward neural network. Neural Netw. 2024 Aug;176:106362. doi: 10.1016/j.neunet.2024.106362. Epub 2024 May 3.
9. An exact mapping from ReLU networks to spiking neural networks. Neural Netw. 2023 Nov;168:74-88. doi: 10.1016/j.neunet.2023.09.011. Epub 2023 Sep 11.
10. Stiff neural ordinary differential equations. Chaos. 2021 Sep;31(9):093122. doi: 10.1063/5.0060697.

Cited by

1. Maximum likelihood estimation of log-affine models using detailed-balanced reaction networks. J Math Biol. 2025 Sep 10;91(4):34. doi: 10.1007/s00285-025-02262-5.
2. Autonomous learning of generative models with chemical reaction network ensembles. J R Soc Interface. 2025 Jan;22(222):20240373. doi: 10.1098/rsif.2024.0373. Epub 2025 Jan 22.
3. Biology-inspired graph neural network encodes reactome and reveals biochemical reactions of disease. Patterns (N Y). 2023 May 22;4(7):100758. doi: 10.1016/j.patter.2023.100758. eCollection 2023 Jul 14.

References

1. Programming and training rate-independent chemical reaction networks. Proc Natl Acad Sci U S A. 2022 Jun 14;119(24):e2111552119. doi: 10.1073/pnas.2111552119. Epub 2022 Jun 9.
2. Scaling up molecular pattern recognition with DNA-based winner-take-all neural networks. Nature. 2018 Jul;559(7714):370-376. doi: 10.1038/s41586-018-0289-6. Epub 2018 Jul 4.
3. Feedforward Chemical Neural Network: An In Silico Chemical System That Learns xor. Artif Life. 2017 Summer;23(3):295-317. doi: 10.1162/ARTL_a_00233.
4. Deterministic Function Computation with Chemical Reaction Networks. Nat Comput. 2012;7433:25-42. doi: 10.1007/s11047-013-9393-6.
5. Biomolecular computing systems: principles, progress and potential. Nat Rev Genet. 2012 Jun 12;13(7):455-68. doi: 10.1038/nrg3197.
6. Neural network computation with DNA strand displacement cascades. Nature. 2011 Jul 20;475(7356):368-72. doi: 10.1038/nature10262.
7. DNA as a universal substrate for chemical kinetics. Proc Natl Acad Sci U S A. 2010 Mar 23;107(12):5393-8. doi: 10.1073/pnas.0909380107. Epub 2010 Mar 4.
8. Computing algebraic functions with biochemical reaction networks. Artif Life. 2009 Winter;15(1):5-19. doi: 10.1162/artl.2009.15.1.15101.
9. On schemes of combinatorial transcription logic. Proc Natl Acad Sci U S A. 2003 Apr 29;100(9):5136-41. doi: 10.1073/pnas.0930314100. Epub 2003 Apr 17.
10. Neural network model of gene expression. FASEB J. 2001 Mar;15(3):846-54. doi: 10.1096/fj.00-0361com.