The Use of Hybrid CNN-RNN Deep Learning Models to Discriminate Tumor Tissue in Dynamic Breast Thermography.

Author Information

Munguía-Siu Andrés, Vergara Irene, Espinoza-Rodríguez Juan Horacio

Affiliations

Department of Computing, Electronics and Mechatronics, Universidad de las Américas Puebla, Sta. Catarina Martir, San Andrés Cholula 72810, Mexico.

Department of Immunology, Instituto de Investigaciones Biomédicas, Universidad Nacional Autónoma de México, Mexico City 04510, Mexico.

Publication Information

J Imaging. 2024 Dec 21;10(12):329. doi: 10.3390/jimaging10120329.

Abstract

Breast cancer is one of the leading causes of death for women worldwide, and early detection can help reduce the death rate. Infrared thermography has gained popularity as a non-invasive and rapid method for detecting this pathology and can be further enhanced by applying neural networks to extract spatial and even temporal data derived from breast thermographic images if they are acquired sequentially. In this study, we evaluated hybrid convolutional-recurrent neural network (CNN-RNN) models based on five state-of-the-art pre-trained CNN architectures coupled with three RNNs to discern tumor abnormalities in dynamic breast thermographic images. The hybrid architecture that achieved the best performance for detecting breast cancer was VGG16-LSTM, which showed accuracy (ACC), sensitivity (SENS), and specificity (SPEC) of 95.72%, 92.76%, and 98.68%, respectively, with a CPU runtime of 3.9 s. However, the hybrid architecture that showed the fastest CPU runtime was AlexNet-RNN with 0.61 s, although with lower performance (ACC: 80.59%, SENS: 68.52%, SPEC: 92.76%), but still superior to AlexNet (ACC: 69.41%, SENS: 52.63%, SPEC: 86.18%) with 0.44 s. Our findings show that hybrid CNN-RNN models outperform stand-alone CNN models, indicating that temporal data recovery from dynamic breast thermographs is possible without significantly compromising classifier runtime.
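The paper's best-performing model pairs a pre-trained CNN (as a per-frame feature extractor) with an RNN that aggregates features across the dynamic thermographic sequence. Below is a minimal sketch of that idea using PyTorch and a VGG16-LSTM pairing; the hidden size, sequence length, frozen backbone, and two-class output are illustrative assumptions, not the authors' exact configuration.

```python
# Hedged sketch: hybrid CNN-RNN classifier for dynamic thermography sequences.
# Assumes PyTorch/torchvision; hyperparameters are placeholders, not the paper's.
import torch
import torch.nn as nn
from torchvision import models

class CnnRnnClassifier(nn.Module):
    def __init__(self, hidden_size=128, num_classes=2):
        super().__init__()
        # Pre-trained VGG16 backbone used as a per-frame spatial feature extractor.
        vgg = models.vgg16(weights=models.VGG16_Weights.DEFAULT)
        self.cnn = nn.Sequential(vgg.features, nn.AdaptiveAvgPool2d(1), nn.Flatten())
        for p in self.cnn.parameters():          # optionally freeze the backbone
            p.requires_grad = False
        # LSTM aggregates per-frame features over the thermal image sequence.
        self.rnn = nn.LSTM(input_size=512, hidden_size=hidden_size, batch_first=True)
        self.fc = nn.Linear(hidden_size, num_classes)

    def forward(self, x):                        # x: (batch, time, 3, 224, 224)
        b, t = x.shape[:2]
        feats = self.cnn(x.flatten(0, 1))        # (batch*time, 512)
        feats = feats.view(b, t, -1)             # (batch, time, 512)
        _, (h, _) = self.rnn(feats)              # h: (num_layers, batch, hidden)
        return self.fc(h[-1])                    # class logits per sequence

# Example: a batch of 4 sequences with 5 thermal frames each.
logits = CnnRnnClassifier()(torch.randn(4, 5, 3, 224, 224))
print(logits.shape)                              # torch.Size([4, 2])
```

Swapping the backbone (e.g., AlexNet) or the recurrent layer (vanilla RNN, GRU, LSTM) in this pattern reproduces the family of hybrids the study compares, trading accuracy against CPU runtime.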

Figure: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2462/11728322/e66dfd035352/jimaging-10-00329-g0A1.jpg
