School of Information Engineering, Jiangxi University of Science and Technology, Ganzhou, 341000, People's Republic of China.
School of Mathematical and Computer Sciences, Heriot-Watt University, Edinburgh, EH14 4AS, UK.
Sci Rep. 2020 Sep 1;10(1):14361. doi: 10.1038/s41598-020-71431-x.
To better address the recognition of abnormalities in mammographic images, in this study we apply a deep fusion learning approach based on pre-trained models to discover discriminative patterns between the Normal and Tumor categories. We designed a deep fusion learning framework for mammographic image classification that works in two main steps. After obtaining the regions of interest (ROIs) from the original dataset, the first step is to train our proposed deep fusion models on ROI patches randomly sampled from all ROIs. We propose a deep fusion model (Model1) that directly fuses the deep features to classify Normal and Tumor ROI patches. To explore the association among channels of the same block, we propose another deep fusion model (Model2) that integrates the cross-channel deep features using 1 × 1 convolution. The second step is to obtain the final prediction by performing majority voting on the predictions of all patches of one ROI. The experimental results show that Model1 achieves an overall accuracy of 0.8906, a recall of 0.9130, and a precision of 0.8077 for the Tumor class. Correspondingly, Model2 achieves an overall accuracy of 0.8750, a recall of 0.9565, and a precision of 0.7586 for the Tumor class. Finally, we open-source our Python code at https://github.com/yxchspring/MIAS to share our tool with the research community.
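The cross-channel fusion in Model2 relies on 1 × 1 (pointwise) convolution, which mixes the channels of a feature block at each spatial location while leaving the spatial layout untouched. The following is a minimal NumPy sketch of that operation, not the authors' implementation; the array shapes and the toy channel counts are illustrative assumptions.

```python
import numpy as np

def conv1x1(feature_map, weights, bias=None):
    """Pointwise (1x1) convolution: a per-pixel linear mix of channels.

    feature_map: (H, W, C_in) array of deep features from one block.
    weights:     (C_in, C_out) channel-mixing matrix.
    Returns an (H, W, C_out) array; spatial dimensions are unchanged.
    """
    out = feature_map @ weights  # applies the same channel mix at every pixel
    if bias is not None:
        out = out + bias
    return out

# Toy example: fuse 4 input channels of a 7x7 feature block down to 2.
rng = np.random.default_rng(0)
fmap = rng.standard_normal((7, 7, 4))
w = rng.standard_normal((4, 2))
fused = conv1x1(fmap, w)
print(fused.shape)  # (7, 7, 2)
```

In a trained network the mixing matrix `w` is learned; the sketch only shows why a 1 × 1 convolution acts purely across channels.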
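The second step, aggregating patch-level predictions into one ROI-level prediction by majority voting, can be sketched in a few lines of Python. This is a generic illustration of the voting rule described in the abstract; the label strings and the example patch counts are assumptions for the demo.

```python
from collections import Counter

def majority_vote(patch_preds):
    """Aggregate patch-level class labels into a single ROI-level label.

    patch_preds: list of predicted labels (e.g. "Normal"/"Tumor"),
    one per patch sampled from the same ROI.
    Returns the most frequent label.
    """
    counts = Counter(patch_preds)
    return counts.most_common(1)[0][0]

# Example: 5 patches from one ROI, 3 of which were classified as Tumor.
roi_patches = ["Tumor", "Normal", "Tumor", "Tumor", "Normal"]
print(majority_vote(roi_patches))  # Tumor
```

With an odd number of patches per ROI, ties between the two classes cannot occur; `Counter.most_common` breaks any remaining ties by insertion order.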