Generalized backpropagation algorithm for training second-order neural networks.

Author Information

Fan Fenglei, Cong Wenxiang, Wang Ge

Affiliations

Biomedical Imaging Center, BME/CBIS, Rensselaer Polytechnic Institute, Troy, NY, USA.

Publication Information

Int J Numer Method Biomed Eng. 2018 May;34(5):e2956. doi: 10.1002/cnm.2956. Epub 2018 Feb 6.

Abstract

The artificial neural network is a popular framework in machine learning. To empower individual neurons, we recently suggested that the current type of neurons could be upgraded to second-order counterparts, in which the linear operation between the inputs to a neuron and the associated weights is replaced with a nonlinear quadratic operation. A single second-order neuron already has strong nonlinear modeling ability, such as the capacity to implement basic fuzzy logic operations. In this paper, we develop a generalized backpropagation algorithm to train networks consisting of second-order neurons. Numerical studies are performed to verify the generalized backpropagation algorithm.
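To make the abstract concrete, the sketch below illustrates the basic idea in NumPy: a single neuron whose pre-activation is a quadratic, rather than purely linear, function of its inputs, with hand-derived chain-rule gradients checked against central finite differences. The specific quadratic form (w_r·x + b_r)(w_g·x + b_g) + w_b·(x∘x) + c, the parameter names, the sigmoid activation, and the squared loss are illustrative assumptions, not the authors' exact formulation or code.

```python
# Minimal sketch (illustrative, not the paper's implementation): one neuron with
# a quadratic pre-activation and hand-derived gradients verified numerically.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, p):
    # Quadratic pre-activation: (w_r.x + b_r)(w_g.x + b_g) + w_b.(x*x) + c
    u = p["w_r"] @ x + p["b_r"]
    v = p["w_g"] @ x + p["b_g"]
    z = u * v + p["w_b"] @ (x * x) + p["c"]
    return sigmoid(z), u, v

def loss_and_grads(x, t, p):
    # Squared loss and analytic gradients obtained by the chain rule.
    y, u, v = forward(x, p)
    loss = 0.5 * (y - t) ** 2
    dz = (y - t) * y * (1.0 - y)            # dL/dz through the sigmoid
    grads = {
        "w_r": dz * v * x, "b_r": dz * v,   # dz/dw_r = v*x, dz/db_r = v
        "w_g": dz * u * x, "b_g": dz * u,   # dz/dw_g = u*x, dz/db_g = u
        "w_b": dz * x * x, "c": dz,         # dz/dw_b = x^2, dz/dc = 1
    }
    return loss.item(), grads

d = 3                                       # toy input dimension
params = {
    "w_r": rng.normal(size=d), "b_r": rng.normal(size=1),
    "w_g": rng.normal(size=d), "b_g": rng.normal(size=1),
    "w_b": rng.normal(size=d), "c": rng.normal(size=1),
}
x, t = rng.normal(size=d), 1.0

# Verify each analytic gradient against a central finite difference.
_, grads = loss_and_grads(x, t, params)
eps = 1e-6
for name, g in grads.items():
    numeric = np.zeros_like(params[name])
    for i in range(params[name].size):
        plus = {k: v.copy() for k, v in params.items()}
        minus = {k: v.copy() for k, v in params.items()}
        plus[name][i] += eps
        minus[name][i] -= eps
        numeric[i] = (loss_and_grads(x, t, plus)[0]
                      - loss_and_grads(x, t, minus)[0]) / (2 * eps)
    print(f"{name}: max |analytic - numeric| = {np.max(np.abs(g - numeric)):.2e}")
```

A full training procedure would accumulate such gradients over a dataset and update the parameters by gradient descent, propagating the error signal layer by layer; that network-level derivation is the role the generalized backpropagation algorithm plays in the paper.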
