Two-stage CNNs for computerized BI-RADS categorization in breast ultrasound images

Affiliations

Department of Biomedical Engineering, College of Materials Science and Engineering, Sichuan University, Chengdu, 610065, China.

College of Electrical Engineering and Information Technology, Sichuan University, Chengdu, 610065, China.

Publication Information

Biomed Eng Online. 2019 Jan 24;18(1):8. doi: 10.1186/s12938-019-0626-5.

Abstract

BACKGROUND

Quantizing the Breast Imaging Reporting and Data System (BI-RADS) criteria into distinct categories using the ultrasound modality alone has long been a challenge. To address this, we proposed a two-stage grading system based on convolutional neural networks (CNNs) that automatically grades breast tumors in ultrasound images into five categories.

METHODS

The newly developed automatic grading system consists of two stages: tumor identification and tumor grading. The identification network, denoted ROI-CNN, localizes the region containing the tumor in the original breast ultrasound image. The subsequent grading network, denoted G-CNN, generates effective features for classifying the identified regions of interest (ROIs) into five categories: Category "3", Category "4A", Category "4B", Category "4C", and Category "5". In particular, to make the regions predicted by the ROI-CNN fit the tumors more closely, a level-set-based refinement procedure was employed as a bridge between the identification stage and the grading stage.
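The data flow described above can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: the names `roi_cnn`, `levelset_refine`, and `g_cnn` are stand-ins, with stage 1 replaced by a brightness threshold, the level-set refinement by a fixed box margin, and stage 2 by an intensity heuristic.

```python
# Hypothetical sketch of the two-stage pipeline: identify -> refine -> grade.
BI_RADS = ["3", "4A", "4B", "4C", "5"]

def roi_cnn(image):
    """Stage 1 stand-in: coarse bounding box (r0, r1, c0, c1) around the
    above-average-intensity region, mimicking tumor identification."""
    flat = [v for row in image for v in row]
    thresh = sum(flat) / len(flat)
    hits = [(r, c) for r, row in enumerate(image)
                   for c, v in enumerate(row) if v > thresh]
    rows = [r for r, _ in hits]
    cols = [c for _, c in hits]
    return min(rows), max(rows) + 1, min(cols), max(cols) + 1

def levelset_refine(image, box, margin=1):
    """Joint-step stand-in: adjust the predicted box so it tailors more
    closely to the tumor (here, simply widen it by a small margin)."""
    r0, r1, c0, c1 = box
    h, w = len(image), len(image[0])
    return (max(r0 - margin, 0), min(r1 + margin, h),
            max(c0 - margin, 0), min(c1 + margin, w))

def g_cnn(roi):
    """Stage 2 stand-in: map the cropped ROI to one of the five
    categories; a real G-CNN would output five class scores."""
    flat = [v for row in roi for v in row]
    score = sum(flat) / len(flat)
    return BI_RADS[min(int(score * len(BI_RADS)), len(BI_RADS) - 1)]

def grade(image):
    box = roi_cnn(image)               # stage 1: tumor identification
    box = levelset_refine(image, box)  # refinement between the stages
    r0, r1, c0, c1 = box
    roi = [row[c0:c1] for row in image[r0:r1]]
    return g_cnn(roi)                  # stage 2: tumor grading

# Synthetic 32x32 "image" with a bright 10x10 patch as the tumor.
img = [[0.9 if 10 <= r < 20 and 12 <= c < 22 else 0.0
        for c in range(32)] for r in range(32)]
print(grade(img))  # prints one of "3", "4A", "4B", "4C", "5"
```

The point of the sketch is the decoupling: the grading network never sees the full image, only the refined crop produced by the identification stage.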

RESULTS

We tested the proposed two-stage grading system on 2238 cases of breast tumors in ultrasound images. Using accuracy as the metric, our automatic computerized grading of breast tumors performed comparably to the subjective categorization by physicians. Experimental results show that the two-stage framework achieves accuracies of 0.998 for Category "3", 0.940 for Category "4A", 0.734 for Category "4B", 0.922 for Category "4C", and 0.876 for Category "5".
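One common reading of per-category accuracy is, for each BI-RADS category, the fraction of cases of that category labeled correctly (i.e. per-class recall); the abstract does not spell out the exact definition, so the sketch below is an assumption, with made-up labels for demonstration.

```python
# Illustrative per-category accuracy (per-class recall), not the paper's code.
from collections import Counter

def per_category_accuracy(y_true, y_pred):
    """For each category appearing in y_true, return the fraction of its
    cases whose prediction matches the true label."""
    total = Counter(y_true)
    correct = Counter(t for t, p in zip(y_true, y_pred) if t == p)
    return {cat: correct[cat] / total[cat] for cat in total}

y_true = ["3", "3", "4A", "4B", "4C", "5", "5"]   # fabricated example labels
y_pred = ["3", "3", "4A", "4C", "4C", "5", "4C"]
print(per_category_accuracy(y_true, y_pred))
```

Reported this way, each category's score is insensitive to how many cases the other categories contain, which is why the five numbers above can differ so widely.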

CONCLUSION

By decoupling the identification features and the classification features into different CNNs, the proposed scheme extracts effective features from breast ultrasound images for the final classification of breast tumors. Moreover, it extends the diagnosis of breast tumors in ultrasound images to five subcategories according to BI-RADS, rather than merely distinguishing malignant tumors from benign ones.


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/d712/6346503/bb6924b35c12/12938_2019_626_Fig1_HTML.jpg
