Agarwal Richa, Díaz Oliver, Yap Moi Hoon, Lladó Xavier, Martí Robert
VICOROB, Department of Computer Architecture and Technology, University of Girona, Spain.
VICOROB, Department of Computer Architecture and Technology, University of Girona, Spain; Department of Mathematics and Computer Science, University of Barcelona, Spain.
Comput Biol Med. 2020 Jun;121:103774. doi: 10.1016/j.compbiomed.2020.103774. Epub 2020 Apr 22.
In recent years, Convolutional Neural Networks (CNNs) have shown improved performance in mass detection and classification in medical imaging compared to current state-of-the-art methods. This paper proposes a fully automated framework, based on the Faster Region-based Convolutional Neural Network (Faster R-CNN) model, to detect masses in Full-Field Digital Mammograms (FFDMs). The framework is applied to the large-scale OPTIMAM Mammography Image Database (OMI-DB), which consists of ∼80,000 FFDMs acquired mainly with Hologic and General Electric (GE) scanners; this research is the first to benchmark the performance of deep learning on OMI-DB. The proposed framework obtained a True Positive Rate (TPR) of 0.93 at 0.78 False Positives per Image (FPI) on FFDMs from the Hologic scanner. Transfer learning was then applied to the Faster R-CNN model trained on Hologic images to detect masses in smaller databases containing FFDMs from the GE scanner and in the public INbreast dataset (Siemens scanner). The detection framework obtained a TPR of 0.91±0.06 at 1.69 FPI on images from the GE scanner, and also outperformed state-of-the-art methods on the INbreast dataset, obtaining a TPR of 0.99±0.03 at 1.17 FPI for malignant masses and 0.85±0.08 at 1.0 FPI for benign masses. These results show the framework's potential to be used as part of an advanced CAD system for breast cancer screening.
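The TPR-at-FPI operating points quoted in the abstract can be computed from per-image detections and ground-truth mass annotations. The following is a minimal, hypothetical sketch (not the authors' code): it assumes axis-aligned boxes and an IoU ≥ 0.5 match criterion at a fixed score threshold, whereas the paper may use a different overlap rule or sweep thresholds to trace a FROC curve.

```python
def iou(a, b):
    """Intersection-over-Union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def tpr_fpi(gt_by_image, det_by_image, score_thresh=0.5, iou_thresh=0.5):
    """TPR and FPI at one operating point.

    gt_by_image:  {image_id: [box, ...]}            ground-truth masses
    det_by_image: {image_id: [(box, score), ...]}   model detections
    """
    tp = fp = n_gt = 0
    for img, gts in gt_by_image.items():
        n_gt += len(gts)
        dets = [d for d in det_by_image.get(img, []) if d[1] >= score_thresh]
        matched = set()  # indices of ground-truth boxes already detected
        for box, _score in dets:
            hit = next((i for i, g in enumerate(gts)
                        if i not in matched and iou(box, g) >= iou_thresh), None)
            if hit is not None:
                matched.add(hit)   # detection explains a new ground-truth mass
            else:
                fp += 1            # unmatched detection is a false positive
        tp += len(matched)
    return tp / n_gt, fp / len(gt_by_image)

# Toy example: 2 images, 1 mass each; image "A" has one hit and one miss.
gt = {"A": [(0, 0, 10, 10)], "B": [(0, 0, 10, 10)]}
det = {"A": [((0, 0, 10, 10), 0.9), ((50, 50, 60, 60), 0.8)]}
print(tpr_fpi(gt, det))  # → (0.5, 0.5): 1 of 2 masses found, 1 FP over 2 images
```

In practice a detection framework like the one described is evaluated across many score thresholds, producing a FROC curve from which single operating points such as "0.93 TPR at 0.78 FPI" are reported.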