Convolution in Convolution for Network in Network.

Publication information

IEEE Trans Neural Netw Learn Syst. 2018 May;29(5):1587-1597. doi: 10.1109/TNNLS.2017.2676130. Epub 2017 Mar 16.

Abstract

Network in network (NiN) is an effective instance and an important extension of the deep convolutional neural network, which consists of alternating convolutional layers and pooling layers. Instead of using a linear filter for convolution, NiN uses a shallow multilayer perceptron (MLP), a nonlinear function, in place of the linear filter. Because of the power of the MLP and of convolutions in the spatial domain, NiN has a stronger feature representation ability and hence achieves better recognition performance. However, the MLP itself consists of fully connected layers, which give rise to a large number of parameters. In this paper, we propose to replace the dense shallow MLP with a sparse shallow MLP. One or more layers of the sparse shallow MLP are sparsely connected in the channel dimension or in the channel-spatial domain. The proposed method is implemented by applying unshared convolution across the channel dimension and shared convolution across the spatial dimension in some computational layers. The proposed method is called convolution in convolution (CiC). Experimental results on the CIFAR10 data set, the augmented CIFAR10 data set, and the CIFAR100 data set demonstrate the effectiveness of the proposed CiC method.
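The following is a minimal PyTorch sketch of the contrast described in the abstract; it is not the authors' reference implementation. `MLPConvBlock` is a standard NiN-style block whose 1x1 convolutions act as a shallow MLP shared across spatial positions and densely connected across channels. `ChannelLocalLayer` is an illustrative sparse replacement in which each output channel connects only to a window of `k_c` neighboring input channels, with weights unshared across channel positions but shared across spatial positions. All class names, layer sizes, and the window size `k_c` are assumptions chosen for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MLPConvBlock(nn.Module):
    """NiN-style MLPconv block: a spatial convolution followed by 1x1
    convolutions. Each 1x1 convolution is a shallow MLP that is shared
    across spatial positions and densely connected across channels."""

    def __init__(self, in_ch, mid_ch, out_ch, k=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, mid_ch, k, padding=k // 2), nn.ReLU(inplace=True),
            nn.Conv2d(mid_ch, mid_ch, 1), nn.ReLU(inplace=True),
            nn.Conv2d(mid_ch, out_ch, 1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.net(x)


class ChannelLocalLayer(nn.Module):
    """Illustrative sparse replacement for a 1x1 convolution: output
    channel j is connected only to a window of k_c input channels around
    position j. Weights are unshared across channel positions but shared
    across all spatial locations."""

    def __init__(self, channels, k_c=3):
        super().__init__()
        self.k_c = k_c
        # one (k_c,) weight vector and one bias per output channel position
        self.weight = nn.Parameter(torch.randn(channels, k_c) * 0.1)
        self.bias = nn.Parameter(torch.zeros(channels))

    def forward(self, x):                          # x: (N, C, H, W)
        n, c, h, w = x.shape
        pad = self.k_c // 2
        # zero-pad the channel dimension so every position has a full window
        xp = F.pad(x, (0, 0, 0, 0, pad, pad))
        # (N, C, H, W, k_c): the k_c-channel window at every channel position
        windows = xp.unfold(1, self.k_c, 1)
        out = torch.einsum('nchwk,ck->nchw', windows, self.weight)
        return out + self.bias.view(1, c, 1, 1)


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    dense = nn.Conv2d(64, 64, 1)               # 64 * 64 = 4096 weights
    sparse = ChannelLocalLayer(64, k_c=3)      # 64 * 3  =  192 weights
    print(MLPConvBlock(64, 64, 64)(x).shape)   # torch.Size([2, 64, 32, 32])
    print(dense(x).shape, sparse(x).shape)     # both (2, 64, 32, 32)
```

In this sketch, the dense 1x1 layer over 64 channels uses 64 x 64 = 4096 weights, while the channel-local layer with k_c = 3 uses 64 x 3 = 192; this illustrates the kind of parameter reduction that sparse channel connections aim at, though the exact CiC layer configuration is described in the paper itself.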
