

Convolution neural network for effective burn region segmentation of color images.

Affiliations

Center for Biomedical Engineering, Indian Institute of Technology Ropar, Punjab, India.

Center for Biomedical Engineering, Indian Institute of Technology Ropar, Punjab, India; Computer Science and Engineering, Indian Institute of Technology Ropar, Punjab, India.

Publication Information

Burns. 2021 Jun;47(4):854-862. doi: 10.1016/j.burns.2020.08.016. Epub 2020 Sep 12.

DOI: 10.1016/j.burns.2020.08.016
PMID: 33158632
Abstract

BACKGROUND

Burn injuries are among the most severe forms of wounds and trauma across the globe. Automated burn diagnosis methods are needed to provide timely treatment to affected patients. Artificial intelligence plays a vital role in developing automated tools and techniques for medical problems. However, the use of advanced AI techniques for color-image-based burn region segmentation remains largely unexplored.

METHOD

In this work, we explore the use of deep learning for the challenging problem of burn region segmentation. We prepared a new pixel-wise labelled burn-image dataset for segmentation and investigated the efficacy of existing state-of-the-art semantic segmentation techniques for color images. Finally, we propose a new convolutional neural network (CNN) that uses atrous convolution to encode rich contextual information and employs a pre-trained ResNet-101 model for better extraction of low-level and mid-level features.
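The atrous (dilated) convolution the method relies on can be illustrated with a minimal NumPy sketch. This is not the authors' code; the function name, valid-padding choice, and shapes are illustrative. The key idea is that sampling the input with a stride of `rate` inside each window enlarges the receptive field without adding parameters:

```python
import numpy as np

def atrous_conv2d(image, kernel, rate=1):
    """2-D atrous (dilated) convolution with 'valid' padding.

    Spacing the kernel taps `rate` pixels apart dilates the effective
    kernel, enlarging the receptive field at no extra parameter cost.
    """
    kh, kw = kernel.shape
    # Effective kernel extent after inserting (rate - 1) gaps between taps.
    eh = kh + (kh - 1) * (rate - 1)
    ew = kw + (kw - 1) * (rate - 1)
    H, W = image.shape
    out = np.zeros((H - eh + 1, W - ew + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Sample the window at dilated positions only.
            patch = image[i:i + eh:rate, j:j + ew:rate]
            out[i, j] = np.sum(patch * kernel)
    return out
```

With `rate=1` this reduces to an ordinary convolution; with `rate=2` a 2×2 kernel covers a 3×3 region while still using four weights.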

RESULTS

The proposed approach achieves state-of-the-art performance on the prepared burn-image dataset, with a Matthews correlation coefficient (MCC) of 77.6% and an accuracy of 93.4%. Improvements of 11.6%, 5.8%, 6.9% and 1.2% over the second-best method are observed in precision, Dice similarity coefficient, Jaccard index and specificity, respectively.
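All of the reported metrics derive from a pixel-wise confusion matrix over the binary masks. The following sketch (illustrative, not from the paper; it assumes both classes are present so no denominator is zero) shows their standard definitions:

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Pixel-wise binary segmentation metrics from two boolean masks.

    Assumes both masks contain at least one positive and one negative
    pixel, so every denominator below is nonzero.
    """
    pred = pred.astype(bool).ravel()
    truth = truth.astype(bool).ravel()
    tp = np.sum(pred & truth)    # burn pixels correctly detected
    tn = np.sum(~pred & ~truth)  # background correctly detected
    fp = np.sum(pred & ~truth)   # background flagged as burn
    fn = np.sum(~pred & truth)   # burn pixels missed
    return {
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        "precision": tp / (tp + fp),
        "specificity": tn / (tn + fp),
        "dice": 2 * tp / (2 * tp + fp + fn),
        "jaccard": tp / (tp + fp + fn),
        # Matthews correlation coefficient: balanced even when the burn
        # region is a small fraction of the image.
        "mcc": (tp * tn - fp * fn) / np.sqrt(
            (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn)),
    }
```

MCC is a sensible headline metric here because burn regions typically occupy a minority of the image, where plain accuracy is inflated by the large background class.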

CONCLUSION

In this work, we propose a novel CNN-based method for burn region segmentation in color images and evaluate it on the newly prepared burn-image dataset. The experimental results demonstrate its effectiveness compared with existing approaches. Furthermore, the proposed pixel-level segmentation method could be useful for estimating burn surface area and burn severity in an accurate and time-efficient manner.


Similar Articles

1. Convolution neural network for effective burn region segmentation of color images.
   Burns. 2021 Jun;47(4):854-862. doi: 10.1016/j.burns.2020.08.016. Epub 2020 Sep 12.
2. A multiple-channel and atrous convolution network for ultrasound image segmentation.
   Med Phys. 2020 Dec;47(12):6270-6285. doi: 10.1002/mp.14512. Epub 2020 Oct 18.
3. Burn image segmentation based on Mask Regions with Convolutional Neural Network deep learning framework: more accurate and more convenient.
   Burns Trauma. 2019 Feb 28;7:6. doi: 10.1186/s41038-018-0137-9. eCollection 2019.
4. Burn Images Segmentation Based on Burn-GAN.
   J Burn Care Res. 2021 Aug 4;42(4):755-762. doi: 10.1093/jbcr/iraa208.
5. BPBSAM: Body part-specific burn severity assessment model.
   Burns. 2020 Sep;46(6):1407-1423. doi: 10.1016/j.burns.2020.03.007. Epub 2020 May 4.
6. Improving burn depth assessment for pediatric scalds by AI based on semantic segmentation of polarized light photography images.
   Burns. 2021 Nov;47(7):1586-1593. doi: 10.1016/j.burns.2021.01.011. Epub 2021 Feb 8.
7. Coarse-to-fine airway segmentation using multi information fusion network and CNN-based region growing.
   Comput Methods Programs Biomed. 2022 Mar;215:106610. doi: 10.1016/j.cmpb.2021.106610. Epub 2022 Jan 8.
8. Znet: Deep Learning Approach for 2D MRI Brain Tumor Segmentation.
   IEEE J Transl Eng Health Med. 2022 May 23;10:1800508. doi: 10.1109/JTEHM.2022.3176737. eCollection 2022.
9. ADR-Net: Context extraction network based on M-Net for medical image segmentation.
   Med Phys. 2020 Sep;47(9):4254-4264. doi: 10.1002/mp.14364. Epub 2020 Aug 2.
10. Cascaded atrous convolution and spatial pyramid pooling for more accurate tumor target segmentation for rectal cancer radiotherapy.
    Phys Med Biol. 2018 Sep 17;63(18):185016. doi: 10.1088/1361-6560/aada6c.

Cited By

1. Semi-Supervised Burn Depth Segmentation Network with Contrast Learning and Uncertainty Correction.
   Sensors (Basel). 2025 Feb 10;25(4):1059. doi: 10.3390/s25041059.
2. Review of machine learning for optical imaging of burn wound severity assessment.
   J Biomed Opt. 2024 Feb;29(2):020901. doi: 10.1117/1.JBO.29.2.020901. Epub 2024 Feb 15.
3. Towards Home-Based Diabetic Foot Ulcer Monitoring: A Systematic Review.
   Sensors (Basel). 2023 Mar 30;23(7):3618. doi: 10.3390/s23073618.
4. Development and evaluation of deep learning algorithms for assessment of acute burns and the need for surgery.
   Sci Rep. 2023 Jan 31;13(1):1794. doi: 10.1038/s41598-023-28164-4.