

AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution.

Author

Kim HyunJin

Affiliation

School of Electronics and Electrical Engineering, Dankook University, Yongin, South Korea.

Publication

PeerJ Comput Sci. 2021 Mar 26;7:e454. doi: 10.7717/peerj-cs.454. eCollection 2021.

DOI: 10.7717/peerj-cs.454
PMID: 33834112
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC8022573/
Abstract

This article proposes a novel network model to achieve more accurate residual binarized convolutional neural networks (CNNs), denoted as AresB-Net. Even though residual CNNs enhance the classification accuracy of binarized neural networks with increasing feature resolution, the degraded classification accuracy is still the primary concern compared with real-valued residual CNNs. AresB-Net consists of novel basic blocks to amortize the severe error from the binarization, suggesting a well-balanced pyramid structure without downsampling convolution. In each basic block, the shortcut is added to the convolution output and then concatenated, and the expanded channels are shuffled for the next grouped convolution. In downsampling stages where the stride is greater than 1, our model adopts only the max-pooling layer to generate a low-cost shortcut. This structure facilitates feature reuse from the previous layers, thus alleviating the error from the binarized convolution and increasing the classification accuracy with reduced computational costs and small weight storage requirements. Despite low hardware costs from the binarized computations, the proposed model achieves remarkable classification accuracies on the CIFAR and ImageNet datasets.
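The basic block described above combines binarized grouped convolutions with a channel shuffle, so that each group in the next grouped convolution sees channels drawn from every previous group. As a rough illustration only (not the authors' implementation), the two elementary operations involved — sign binarization and channel shuffle on an NCHW tensor — can be sketched in NumPy:

```python
import numpy as np

def binarize(x: np.ndarray) -> np.ndarray:
    """Sign binarization used in binarized CNNs: map values to +1 / -1."""
    return np.where(x >= 0, 1.0, -1.0)

def channel_shuffle(x: np.ndarray, groups: int) -> np.ndarray:
    """Shuffle channels of an NCHW tensor across `groups` groups,
    as done between grouped convolutions in shuffle-style blocks."""
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    # Reshape to (N, groups, C//groups, H, W), swap the two group axes,
    # and flatten back: afterwards each contiguous group of channels
    # contains one channel from every original group.
    return (x.reshape(n, groups, c // groups, h, w)
             .transpose(0, 2, 1, 3, 4)
             .reshape(n, c, h, w))

# Toy example: 4 channels in 2 groups; channel order becomes [0, 2, 1, 3].
x = np.arange(4, dtype=float).reshape(1, 4, 1, 1)
y = channel_shuffle(x, groups=2)
print(y.ravel())  # [0. 2. 1. 3.]
```

This sketch only demonstrates the shuffle and binarization steps; the paper's full block additionally concatenates the shortcut with the convolution output and uses max pooling for the strided shortcut.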


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/021f/8022573/19b423497608/peerj-cs-07-454-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/021f/8022573/47d5ee02cf97/peerj-cs-07-454-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/021f/8022573/8318392383e3/peerj-cs-07-454-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/021f/8022573/c11a681c956a/peerj-cs-07-454-g004.jpg

Similar Articles

1. AresB-Net: accurate residual binarized neural networks using shortcut concatenation and shuffled grouped convolution.
PeerJ Comput Sci. 2021 Mar 26;7:e454. doi: 10.7717/peerj-cs.454. eCollection 2021.
2. PresB-Net: parametric binarized neural network with learnable activations and shuffled grouped convolution.
PeerJ Comput Sci. 2022 Jan 3;8:e842. doi: 10.7717/peerj-cs.842. eCollection 2022.
3. A storage-efficient ensemble classification using filter sharing on binarized convolutional neural networks.
PeerJ Comput Sci. 2022 Mar 29;8:e924. doi: 10.7717/peerj-cs.924. eCollection 2022.
4. Differential convolutional neural network.
Neural Netw. 2019 Aug;116:279-287. doi: 10.1016/j.neunet.2019.04.025. Epub 2019 May 10.
5. CQ Training: Minimizing Accuracy Loss in Conversion From Convolutional Neural Networks to Spiking Neural Networks.
IEEE Trans Pattern Anal Mach Intell. 2023 Oct;45(10):11600-11611. doi: 10.1109/TPAMI.2023.3286121. Epub 2023 Sep 5.
6. Optronic convolutional neural networks of multi-layers with different functions executed in optics for image classification.
Opt Express. 2021 Feb 15;29(4):5877-5889. doi: 10.1364/OE.415542.
7. A Dual Neural Architecture Combined SqueezeNet with OctConv for LiDAR Data Classification.
Sensors (Basel). 2019 Nov 12;19(22):4927. doi: 10.3390/s19224927.
8. Improved Residual Network based on norm-preservation for visual recognition.
Neural Netw. 2023 Jan;157:305-322. doi: 10.1016/j.neunet.2022.10.023. Epub 2022 Oct 28.
9. ECG signal classification with binarized convolutional neural network.
Comput Biol Med. 2020 Jun;121:103800. doi: 10.1016/j.compbiomed.2020.103800. Epub 2020 May 5.
10. Conversion of Continuous-Valued Deep Networks to Efficient Event-Driven Networks for Image Classification.
Front Neurosci. 2017 Dec 7;11:682. doi: 10.3389/fnins.2017.00682. eCollection 2017.

Cited By

1. Prediction of bone oligometastases in breast cancer using models based on deep learning radiomics of PET/CT imaging.
Front Oncol. 2025 Aug 21;15:1621677. doi: 10.3389/fonc.2025.1621677. eCollection 2025.
2. A storage-efficient ensemble classification using filter sharing on binarized convolutional neural networks.
PeerJ Comput Sci. 2022 Mar 29;8:e924. doi: 10.7717/peerj-cs.924. eCollection 2022.
3. PresB-Net: parametric binarized neural network with learnable activations and shuffled grouped convolution.
PeerJ Comput Sci. 2022 Jan 3;8:e842. doi: 10.7717/peerj-cs.842. eCollection 2022.