GA-Net: Ghost convolution adaptive fusion skin lesion segmentation network.

Affiliations

School of Electrical Engineering and Automation, Jiangxi University of Science and Technology, Ganzhou, Jiangxi, 341000, China; Jinguan Copper Branch of Tongling Nonferrous Metals Group Co, Ltd, Tongling, Anhui, 244100, China.

School of Electrical Engineering and Automation, Jiangxi University of Science and Technology, Ganzhou, Jiangxi, 341000, China.

Publication Information

Comput Biol Med. 2023 Sep;164:107273. doi: 10.1016/j.compbiomed.2023.107273. Epub 2023 Jul 27.

DOI:10.1016/j.compbiomed.2023.107273
PMID:37562327
Abstract

Automatic segmentation of skin lesions is a pivotal task in computer-aided diagnosis, playing a crucial role in the early detection and treatment of skin cancer. Despite the existence of numerous deep learning-based segmentation methods, the extraction of lesion features remains inadequate as a result of the segmentation process. Consequently, skin lesion image segmentation continues to face challenges regarding missing detailed information and inaccurate segmentation of the lesion region. In this paper, we propose a ghost convolution adaptive fusion network for skin lesion segmentation. First, the neural network incorporates a ghost module instead of the ordinary convolution layer, generating a rich skin lesion feature map for comprehensive target feature extraction. Subsequently, the network employs an adaptive fusion module and bilateral attention module to connect the encoding and decoding layers, facilitating the integration of shallow and deep network information. Moreover, multi-level output patterns are used for pixel prediction. Layer feature fusion effectively combines output features of different scales, thus improving image segmentation accuracy. The proposed network was extensively evaluated on three publicly available datasets: ISIC2016, ISIC2017, and ISIC2018. The experimental results demonstrated accuracies of 96.42%, 94.07%, and 95.03%, and kappa coefficients of 90.41%, 81.08%, and 86.96%, respectively. The overall performance of our network surpassed that of existing networks. Simulation experiments further revealed that the ghost convolution adaptive fusion network exhibited superior segmentation results for skin lesion images, offering new possibilities for the diagnosis of skin diseases.

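The ghost module the abstract describes replaces an ordinary convolution with a cheap two-stage construction: a small "primary" convolution produces a few intrinsic feature maps, and inexpensive per-channel operations generate additional "ghost" maps from them. A minimal NumPy sketch of this idea, assuming a 1x1 primary convolution and per-channel scaling as the cheap operation (the paper's actual kernel sizes and cheap-operation choice are not specified in the abstract):

```python
import numpy as np

def ghost_module(x, primary_w, cheap_w):
    """Sketch of a Ghost module as used in place of ordinary convolution.

    x         : (C_in, H, W) input feature map.
    primary_w : (C_prim, C_in) weights of a 1x1 primary convolution that
                produces a small set of intrinsic feature maps.
    cheap_w   : (C_prim,) per-channel scales standing in for the cheap
                depthwise operation that generates the ghost features.
    Returns   : (2 * C_prim, H, W) — intrinsic maps concatenated with
                their ghost counterparts.
    """
    # Primary 1x1 convolution: a channel-mixing contraction over C_in.
    intrinsic = np.tensordot(primary_w, x, axes=([1], [0]))  # (C_prim, H, W)
    # Cheap per-channel operation producing the ghost feature maps.
    ghost = intrinsic * cheap_w[:, None, None]
    # Concatenate along the channel axis to double the output width.
    return np.concatenate([intrinsic, ghost], axis=0)
```

The point of the construction is that half the output channels cost only a per-channel multiply rather than a full convolution, which is why the paper can afford a richer feature map at the same budget.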

Similar Articles

1. GA-Net: Ghost convolution adaptive fusion skin lesion segmentation network.
Comput Biol Med. 2023 Sep;164:107273. doi: 10.1016/j.compbiomed.2023.107273. Epub 2023 Jul 27.
2. HMA-Net: A deep U-shaped network combined with HarDNet and multi-attention mechanism for medical image segmentation.
Med Phys. 2023 Mar;50(3):1635-1646. doi: 10.1002/mp.16065. Epub 2022 Nov 3.
3. Intelligent skin lesion segmentation using deformable attention Transformer U-Net with bidirectional attention mechanism in skin cancer images.
Skin Res Technol. 2024 Aug;30(8):e13783. doi: 10.1111/srt.13783.
4. ELA-Net: An Efficient Lightweight Attention Network for Skin Lesion Segmentation.
Sensors (Basel). 2024 Jul 2;24(13):4302. doi: 10.3390/s24134302.
5. ACCPG-Net: A skin lesion segmentation network with Adaptive Channel-Context-Aware Pyramid Attention and Global Feature Fusion.
Comput Biol Med. 2023 Mar;154:106580. doi: 10.1016/j.compbiomed.2023.106580. Epub 2023 Jan 25.
6. BLA-Net: Boundary learning assisted network for skin lesion segmentation.
Comput Methods Programs Biomed. 2022 Nov;226:107190. doi: 10.1016/j.cmpb.2022.107190. Epub 2022 Oct 19.
7. PMJAF-Net: Pyramidal multi-scale joint attention and adaptive fusion network for explainable skin lesion segmentation.
Comput Biol Med. 2023 Oct;165:107454. doi: 10.1016/j.compbiomed.2023.107454. Epub 2023 Sep 12.
8. ULFAC-Net: Ultra-Lightweight Fully Asymmetric Convolutional Network for Skin Lesion Segmentation.
IEEE J Biomed Health Inform. 2023 Jun;27(6):2886-2897. doi: 10.1109/JBHI.2023.3259802. Epub 2023 Jun 5.
9. Transformer guided self-adaptive network for multi-scale skin lesion image segmentation.
Comput Biol Med. 2024 Feb;169:107846. doi: 10.1016/j.compbiomed.2023.107846. Epub 2023 Dec 23.
10. DBNet-SI: Dual branch network of shift window attention and inception structure for skin lesion segmentation.
Comput Biol Med. 2024 Mar;170:108090. doi: 10.1016/j.compbiomed.2024.108090. Epub 2024 Feb 2.

Cited By

1. sEMG-based gesture recognition using multi-stream adaptive CNNs with integrated residual modules.
Front Bioeng Biotechnol. 2025 Apr 29;13:1487020. doi: 10.3389/fbioe.2025.1487020. eCollection 2025.
2. Identification of Lighting Strike Damage and Prediction of Residual Strength of Carbon Fiber-Reinforced Polymer Laminates Using a Machine Learning Approach.
Polymers (Basel). 2025 Jan 13;17(2):180. doi: 10.3390/polym17020180.