Role of Layers and Neurons in Deep Learning With the Rectified Linear Unit.

Author Information

Takekawa Akira, Kajiura Masayuki, Fukuda Hiroya

Affiliations

Graduate School of Frontier Science, Konan University, Kobe, JPN.

Graduate School of Human Development and Environment, Kobe University, Kobe, JPN.

Publication Information

Cureus. 2021 Oct 18;13(10):e18866. doi: 10.7759/cureus.18866. eCollection 2021 Oct.

Abstract

Deep learning is used to classify data into several groups based on nonlinear curved surfaces. In this paper, we focus on the theoretical analysis of deep learning using the rectified linear unit (ReLU) activation function. Because layers approximate a nonlinear curved surface, increasing the number of layers improves the approximation accuracy of the curved surface. While neurons perform a layer-by-layer approximation of the most appropriate hyperplanes, increasing their number cannot improve the results obtained via canonical correlation analysis (CCA). These results illustrate the functions of layers and neurons in deep learning with ReLU.
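The abstract's argument rests on a background fact about ReLU networks: because each unit is piecewise linear, the network as a whole approximates a curved surface with hyperplanes (line segments in one dimension). The sketch below, assuming only NumPy and hand-set weights, illustrates that premise with a one-hidden-layer ReLU network interpolating f(x) = x² on [0, 1]; it is an illustrative toy, not the authors' code, and it does not reproduce the paper's layer/neuron analysis or its CCA experiments.

```python
import numpy as np

# Minimal sketch (not the authors' code): a one-hidden-layer ReLU network
# realises a piecewise-linear function, i.e. it approximates a curved
# surface with hyperplanes (line segments in this 1-D toy case).

def relu(z):
    return np.maximum(0.0, z)

def relu_interpolant(x, f, knots):
    """ReLU net whose weights are chosen so that its output
    piecewise-linearly interpolates f at the given knots."""
    y = f(knots)
    slopes = np.diff(y) / np.diff(knots)              # slope of each segment
    weights = np.diff(slopes, prepend=0.0)            # slope change at each knot
    hidden = relu(x[:, None] - knots[:-1][None, :])   # hidden ReLU activations
    return y[0] + hidden @ weights                    # affine read-out layer

f = lambda x: x ** 2                                  # the "curved surface"
knots = np.linspace(0.0, 1.0, 9)                      # 8 ReLU units
x = np.linspace(0.0, 1.0, 1001)

max_err = np.max(np.abs(relu_interpolant(x, f, knots) - f(x)))
print(f"max |x^2 - ReLU approximation| on [0, 1]: {max_err:.5f}")
```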

Figure: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/bf94/8601259/76374c477094/cureus-0013-00000018866-i01.jpg
