Shi Jun, Wang Ruoyu, Zheng Yushan, Jiang Zhiguo, Zhang Haopeng, Yu Lanlan
School of Software, Hefei University of Technology, Hefei 230601, China.
Image Processing Center, School of Astronautics, Beihang University, Beijing, 100191, China; Beijing Advanced Innovation Center for Biomedical Engineering, Beihang University, Beijing, 100191, China; Beijing Key Laboratory of Digital Media, Beihang University, Beijing, 100191, China.
Comput Methods Programs Biomed. 2021 Jan;198:105807. doi: 10.1016/j.cmpb.2020.105807. Epub 2020 Oct 22.
Cervical cell classification has important clinical significance in early-stage cervical cancer screening. In contrast with conventional classification methods that depend on hand-crafted or engineered features, Convolutional Neural Networks (CNNs) generally classify cervical cells via learned deep features. However, the latent correlations among images may be ignored during CNN feature learning, which limits the representational ability of the CNN features.
We propose a novel cervical cell classification method based on the Graph Convolutional Network (GCN). It aims to explore the potential relationships among cervical cell images to improve classification performance. The CNN features of all the cervical cell images are first clustered, and the intrinsic relationships among images are preliminarily revealed through this clustering. To further capture the underlying correlations among clusters, a graph structure is constructed. A GCN is then applied to propagate node dependencies and thus yield relation-aware feature representations. Finally, the GCN features are incorporated to enhance the discriminative ability of the CNN features.
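A minimal sketch of this clustering, graph construction, GCN propagation, and feature fusion pipeline is given below. The use of k-means, the k-NN adjacency over cluster centroids, and the single randomly initialized GCN layer are illustrative assumptions, not the paper's exact configuration.

```python
# Sketch: cluster CNN features, build a graph over clusters, propagate with one
# GCN layer, and fuse the relation-aware features with the original CNN features.
# k-means, the k-NN adjacency, and the random GCN weight are assumptions.
import numpy as np
from sklearn.cluster import KMeans

def build_knn_adjacency(centroids, k=5):
    """Symmetric, normalized k-NN adjacency over cluster centroids (assumed construction)."""
    d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)
    n = centroids.shape[0]
    adj = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d[i])[1:k + 1]      # nearest neighbours, skipping self
        adj[i, nbrs] = 1.0
    adj = np.maximum(adj, adj.T)              # symmetrize
    adj += np.eye(n)                          # add self-loops
    deg_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
    return adj * deg_inv_sqrt[:, None] * deg_inv_sqrt[None, :]   # D^-1/2 (A + I) D^-1/2

def relation_aware_features(cnn_feats, n_clusters=16, k=5, seed=0):
    """Return CNN features concatenated with GCN-propagated cluster features."""
    km = KMeans(n_clusters=n_clusters, random_state=seed).fit(cnn_feats)
    centroids = km.cluster_centers_                       # (n_clusters, d)
    a_hat = build_knn_adjacency(centroids, k=k)
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((centroids.shape[1], centroids.shape[1])) * 0.01
    gcn_out = np.maximum(a_hat @ centroids @ w, 0.0)      # one GCN layer: ReLU(A_hat X W)
    per_image_gcn = gcn_out[km.labels_]                   # map cluster feature back to each image
    return np.concatenate([cnn_feats, per_image_gcn], axis=1)

# Usage: fused = relation_aware_features(cnn_feats)  # cnn_feats: (num_images, d)
```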
Experiments on the public cervical cell image dataset SIPaKMeD, introduced at the International Conference on Image Processing in 2018, demonstrate the feasibility and effectiveness of the proposed method. In addition, we introduce a large-scale Motic liquid-based cytology image dataset, which provides a large amount of data, several novel cell types of important clinical significance, and staining differences, and thus presents a great challenge for cervical cell classification. We evaluate the proposed method under two conditions: consistent staining and different staining. Experimental results show that our method outperforms existing state-of-the-art methods according to the quantitative metrics (i.e., accuracy, sensitivity, specificity, F-measure, and confusion matrices).
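For reference, the reported metrics can be computed from a multi-class confusion matrix as in the short sketch below; the one-vs-rest, macro-averaged formulation is an assumption, not necessarily the averaging used in the paper.

```python
# Illustrative computation of accuracy, sensitivity, specificity, and F-measure
# from a multi-class confusion matrix (one-vs-rest per class, macro-averaged).
import numpy as np

def classification_metrics(cm):
    """cm[i, j] = number of class-i samples predicted as class j."""
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)
    fn = cm.sum(axis=1) - tp
    fp = cm.sum(axis=0) - tp
    tn = cm.sum() - tp - fn - fp
    sensitivity = tp / (tp + fn)                          # per-class recall
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    f_measure = 2 * precision * sensitivity / (precision + sensitivity)
    return {
        "accuracy": tp.sum() / cm.sum(),
        "sensitivity": sensitivity.mean(),
        "specificity": specificity.mean(),
        "f_measure": f_measure.mean(),
    }

# Example: classification_metrics([[50, 2, 1], [3, 45, 2], [0, 4, 48]])
```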
The exploration of the intrinsic relationships among cervical cells contributes significant improvements to cervical cell classification. The relation-aware features generated by the GCN effectively strengthen the representational power of the CNN features. The proposed method achieves better classification performance and can potentially be used in automatic screening systems for cervical cytology.