
Morphological Convolutional Neural Network Architecture for Digit Recognition.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2019 Sep;30(9):2876-2885. doi: 10.1109/TNNLS.2018.2890334. Epub 2019 Jan 23.

DOI: 10.1109/TNNLS.2018.2890334
PMID: 30676985
Abstract

Deep neural networks have shown promising results in many applications and fields, but they are still treated as a black box. It is therefore useful to introduce interpretability to prevent the blind application of deep networks. This paper proposes an interpretable morphological convolutional neural network, called Morph-CNN, for pattern recognition, in which morphological operations are incorporated into the convolutional layer via the counter-harmonic mean in order to generate enhanced feature maps. Morph-CNN was extensively evaluated on the MNIST and SVHN digit-recognition benchmarks. The different tested configurations showed that Morph-CNN outperforms the existing methods.
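The counter-harmonic mean (CHM) mentioned in the abstract is the mechanism that lets a convolutional layer approximate morphological operations: CHM_p(x; w) = conv(x^(p+1), w) / conv(x^p, w), where large positive p approximates dilation (local max), large negative p approximates erosion (local min), and p = 0 reduces to an ordinary weighted average. A minimal NumPy sketch of this filter follows; it is an illustration of the CHM idea, not the authors' implementation, and the function name and the `eps` stabilizer are our own assumptions.

```python
import numpy as np

def chm_filter(x, kernel, p, eps=1e-7):
    """Counter-harmonic mean filtering over sliding windows (valid mode).

    CHM_p(x; w) = sum(w * x**(p+1)) / sum(w * x**p) per window:
    p -> +inf approximates dilation (local max), p -> -inf approximates
    erosion (local min), and p = 0 is a plain weighted average.
    """
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    xp = x + eps  # keep powers well-defined for zero pixels / negative p
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = xp[i:i + kh, j:j + kw]
            num = np.sum(kernel * patch ** (p + 1))
            den = np.sum(kernel * patch ** p)
            out[i, j] = num / den
    return out

# On a 3x3 patch holding 1..9 with a flat kernel, p = 0 returns the
# mean (5), p = 30 approaches the local max (9), p = -30 the local min (1).
x = np.arange(1, 10, dtype=float).reshape(3, 3)
k = np.ones((3, 3))
print(chm_filter(x, k, 0)[0, 0], chm_filter(x, k, 30)[0, 0], chm_filter(x, k, -30)[0, 0])
```

In the paper's setting the kernel weights are learned, so a single layer can interpolate between erosion-like, average-like, and dilation-like responses depending on the learned parameters; the loop above would be replaced by batched convolutions in practice.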

Similar Articles

1. Morphological Convolutional Neural Network Architecture for Digit Recognition. IEEE Trans Neural Netw Learn Syst. 2019 Sep;30(9):2876-2885. doi: 10.1109/TNNLS.2018.2890334. Epub 2019 Jan 23.
2. Memristor Based Binary Convolutional Neural Network Architecture With Configurable Neurons. Front Neurosci. 2021 Mar 26;15:639526. doi: 10.3389/fnins.2021.639526. eCollection 2021.
3. Training Deep Spiking Neural Networks Using Backpropagation. Front Neurosci. 2016 Nov 8;10:508. doi: 10.3389/fnins.2016.00508. eCollection 2016.
4. Improved Handwritten Digit Recognition Using Convolutional Neural Networks (CNN). Sensors (Basel). 2020 Jun 12;20(12):3344. doi: 10.3390/s20123344.
5. Full depth CNN classifier for handwritten and license plate characters recognition. PeerJ Comput Sci. 2021 Jun 18;7:e576. doi: 10.7717/peerj-cs.576. eCollection 2021.
6. A multimodal convolutional neuro-fuzzy network for emotion understanding of movie clips. Neural Netw. 2019 Oct;118:208-219. doi: 10.1016/j.neunet.2019.06.010. Epub 2019 Jul 2.
7. Improving efficiency in convolutional neural networks with multilinear filters. Neural Netw. 2018 Sep;105:328-339. doi: 10.1016/j.neunet.2018.05.017. Epub 2018 Jun 7.
8. Interpretable neural networks: principles and applications. Front Artif Intell. 2023 Oct 13;6:974295. doi: 10.3389/frai.2023.974295. eCollection 2023.
9. An effective classifier based on convolutional neural network and regularized extreme learning machine. Math Biosci Eng. 2019 Sep 17;16(6):8309-8321. doi: 10.3934/mbe.2019420.
10. On the Use of Concentrated Time-Frequency Representations as Input to a Deep Convolutional Neural Network: Application to Non Intrusive Load Monitoring. Entropy (Basel). 2020 Aug 19;22(9):911. doi: 10.3390/e22090911.

Cited By

1. Postprocessing for Skin Detection. J Imaging. 2021 Jun 3;7(6):95. doi: 10.3390/jimaging7060095.
2. Two-Stage Feature Generator for Handwritten Digit Classification. Sensors (Basel). 2023 Oct 15;23(20):8477. doi: 10.3390/s23208477.
3. Hybrid morphological-convolutional neural networks for computer-aided diagnosis. Front Artif Intell. 2023 Sep 19;6:1253183. doi: 10.3389/frai.2023.1253183. eCollection 2023.