Saffari Nasibeh, Rashwan Hatem A, Abdel-Nasser Mohamed, Kumar Singh Vivek, Arenas Meritxell, Mangina Eleni, Herrera Blas, Puig Domenec
Intelligent Robotics and Computer Vision Group, Department of Computer Engineering and Mathematics, Universitat Rovira i Virgili, 43007 Tarragona, Spain.
Department of Electrical Engineering, Aswan University, Aswan 81542, Egypt.
Diagnostics (Basel). 2020 Nov 23;10(11):988. doi: 10.3390/diagnostics10110988.
Breast density estimation by visual evaluation remains challenging because of the low contrast and the large variability of the fatty tissue background in mammograms. The key to breast density classification is to correctly detect the dense tissues in mammographic images. Many methods have been proposed for breast density estimation; nevertheless, most of them are not fully automated, and they are adversely affected by the low signal-to-noise ratio and by the variability of density in appearance and texture. This study aims to develop a fully automated, digital pipeline for breast tissue segmentation and classification using advanced deep learning techniques. A conditional Generative Adversarial Network (cGAN) is applied to segment the dense tissues in mammograms. To obtain a complete breast density classification system, we propose a Convolutional Neural Network (CNN) that classifies mammograms according to the Breast Imaging-Reporting and Data System (BI-RADS) standard. The classification network is fed with the dense-tissue masks generated by the cGAN. For screening mammography, 410 images of 115 patients from the INbreast dataset were used. The proposed framework segments the dense regions with an accuracy, Dice coefficient, and Jaccard index of 98%, 88%, and 78%, respectively. For breast density classification, it achieves a precision, sensitivity, and specificity of 97.85%, 97.85%, and 99.28%, respectively. These findings are promising and show that the proposed deep learning-based techniques can provide a clinically useful computer-aided tool for breast density analysis in digital mammography.
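The segmentation results above are reported as Dice coefficient and Jaccard index between the predicted dense-tissue mask and the ground-truth annotation. As a minimal illustrative sketch (not the authors' evaluation code; the function names and the toy masks are assumptions), these two overlap measures can be computed from binary masks as follows:

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

def jaccard_index(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Jaccard index (IoU) between two binary masks: |A∩B| / |A∪B|."""
    pred, target = pred.astype(bool), target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)

if __name__ == "__main__":
    # Toy example: a hypothetical ground-truth dense-tissue mask and a noisy prediction.
    rng = np.random.default_rng(0)
    gt = rng.random((256, 256)) > 0.7
    pred = np.logical_xor(gt, rng.random((256, 256)) > 0.95)
    print(f"Dice:    {dice_coefficient(pred, gt):.3f}")
    print(f"Jaccard: {jaccard_index(pred, gt):.3f}")
```

Note that the Dice coefficient and the Jaccard index are monotonically related (Dice = 2J / (1 + J)), which is why the reported 88% Dice corresponds to a lower 78% Jaccard on the same masks.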