
Suppr 超能文献



Learning on tree architectures outperforms a convolutional feedforward network.

Affiliations

Department of Physics, Bar-Ilan University, 52900, Ramat-Gan, Israel.

Gonda Interdisciplinary Brain Research Center, Bar-Ilan University, 52900, Ramat-Gan, Israel.

Publication Information

Sci Rep. 2023 Jan 30;13(1):962. doi: 10.1038/s41598-023-27986-6.

DOI: 10.1038/s41598-023-27986-6
PMID: 36717568
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9886946/
Abstract

Advanced deep learning architectures consist of tens of fully connected and convolutional hidden layers, currently extended to hundreds, and are far from their biological realization. Their biologically implausible dynamics rely on changing a weight in a non-local manner via the backpropagation technique, as the number of routes between an output unit and a weight is typically large. Here, a 3-layer tree architecture inspired by experimentally observed dendritic tree adaptations is developed and applied to offline and online learning of the CIFAR-10 database. The proposed architecture outperforms the achievable success rates of the 5-layer convolutional LeNet. Moreover, the highly pruned tree backpropagation of the proposed architecture, in which a single route connects an output unit and a weight, represents an efficient form of dendritic deep learning.
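The "single route" property the abstract emphasizes can be illustrated with a minimal sketch: when the input is split into disjoint branches that each feed exactly one hidden unit, every weight lies on exactly one path to the output, so its gradient is local to that path. The sizes, tanh activation, and squared-error loss below are illustrative assumptions, not the paper's exact Tree-3 architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Disjoint input branches: each branch feeds exactly one hidden unit,
# so every weight has a single route to the output (hypothetical sizes).
n_branches, branch_size = 4, 8
x = rng.standard_normal(n_branches * branch_size)

W1 = rng.standard_normal((n_branches, branch_size)) * 0.1  # branch weights
w2 = rng.standard_normal(n_branches) * 0.1                 # readout weights

def forward(x, W1, w2):
    branches = x.reshape(n_branches, branch_size)
    h = np.tanh(np.sum(W1 * branches, axis=1))  # one hidden unit per branch
    return w2 @ h, h, branches

y, h, branches = forward(x, W1, w2)

# Backprop along the single route: with squared-error loss L = (y - t)^2 / 2,
# dL/dW1[i, j] passes only through hidden unit h[i] -- a local update.
target = 1.0
dL_dy = y - target
grad_w2 = dL_dy * h
grad_W1 = (dL_dy * w2 * (1 - h**2))[:, None] * branches
```

Because no weight's gradient sums over multiple output routes, the update for `W1[i, j]` depends only on the branch input `branches[i, j]` and the state of its own hidden unit, which is the locality the tree architecture buys.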


Figures:
Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/45a7/9886946/7ca23ac135bf/41598_2023_27986_Fig1_HTML.jpg
Fig. 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/45a7/9886946/ae96d91aedef/41598_2023_27986_Fig2_HTML.jpg

Similar Articles

1. Learning on tree architectures outperforms a convolutional feedforward network. Sci Rep. 2023 Jan 30;13(1):962. doi: 10.1038/s41598-023-27986-6.
2. Efficient shallow learning as an alternative to deep learning. Sci Rep. 2023 Apr 20;13(1):5423. doi: 10.1038/s41598-023-32559-8.
3. Enhancing the accuracies by performing pooling decisions adjacent to the output layer. Sci Rep. 2023 Aug 31;13(1):13385. doi: 10.1038/s41598-023-40566-y.
4. Forward layer-wise learning of convolutional neural networks through separation index maximizing. Sci Rep. 2024 Apr 13;14(1):8576. doi: 10.1038/s41598-024-59176-3.
5. Biologically Plausible Training Mechanisms for Self-Supervised Learning in Deep Networks. Front Comput Neurosci. 2022 Mar 21;16:789253. doi: 10.3389/fncom.2022.789253. eCollection 2022.
6. Biologically plausible deep learning - but how far can we go with shallow networks? Neural Netw. 2019 Oct;118:90-101. doi: 10.1016/j.neunet.2019.06.001. Epub 2019 Jun 20.
7. hidden layer artificial neural network architecture computer code: geophysical application example. Heliyon. 2020 Jun 11;6(6):e04108. doi: 10.1016/j.heliyon.2020.e04108. eCollection 2020 Jun.
8. Learning hidden patterns from patient multivariate time series data using convolutional neural networks: A case study of healthcare cost prediction. J Biomed Inform. 2020 Nov;111:103565. doi: 10.1016/j.jbi.2020.103565. Epub 2020 Sep 25.
9. Fast learning method for convolutional neural networks using extreme learning machine and its application to lane detection. Neural Netw. 2017 Mar;87:109-121. doi: 10.1016/j.neunet.2016.12.002. Epub 2016 Dec 10.
10. Deep Learning With Asymmetric Connections and Hebbian Updates. Front Comput Neurosci. 2019 Apr 4;13:18. doi: 10.3389/fncom.2019.00018. eCollection 2019.

Cited By

1. A polynomial proxy model approach to verifiable decentralized federated learning. Sci Rep. 2024 Nov 20;14(1):28786. doi: 10.1038/s41598-024-79798-x.
2. Enhancing the accuracies by performing pooling decisions adjacent to the output layer. Sci Rep. 2023 Aug 31;13(1):13385. doi: 10.1038/s41598-023-40566-y.

References

1. Efficient dendritic learning as an alternative to synaptic plasticity hypothesis. Sci Rep. 2022 Apr 28;12(1):6571. doi: 10.1038/s41598-022-10466-8.
2. Burst-dependent synaptic plasticity can coordinate learning in hierarchical circuits. Nat Neurosci. 2021 Jul;24(7):1010-1019. doi: 10.1038/s41593-021-00857-x. Epub 2021 May 13.
3. Illuminating dendritic function with computational models. Nat Rev Neurosci. 2020 Jun;21(6):303-321. doi: 10.1038/s41583-020-0301-7. Epub 2020 May 11.
4. Brain experiments imply adaptation mechanisms which outperform common AI learning algorithms. Sci Rep. 2020 Apr 23;10(1):6923. doi: 10.1038/s41598-020-63755-5.
5. Dendritic action potentials and computation in human layer 2/3 cortical neurons. Science. 2020 Jan 3;367(6473):83-87. doi: 10.1126/science.aax6239.
6. A deep learning framework for neuroscience. Nat Neurosci. 2019 Nov;22(11):1761-1770. doi: 10.1038/s41593-019-0520-2. Epub 2019 Oct 28.
7. Adaptive nodes enrich nonlinear cooperative learning beyond traditional adaptation by links. Sci Rep. 2018 Mar 23;8(1):5100. doi: 10.1038/s41598-018-23471-7.
8. Branching into brains. Elife. 2017 Dec 5;6:e33066. doi: 10.7554/eLife.33066.
9. Deep learning. Nature. 2015 May 28;521(7553):436-44. doi: 10.1038/nature14539.
10. Deep learning in neural networks: an overview. Neural Netw. 2015 Jan;61:85-117. doi: 10.1016/j.neunet.2014.09.003. Epub 2014 Oct 13.