Cao Yuzhen, Ma Huizhan, Fan Yinuo, Liu Yuzhen, Zhang Haifeng, Cao Chengcheng, Yu Hui
School of Precision Instrument and Opto-Electronics Engineering, Tianjin University, Tianjin, China.
Academy of Medical Engineering and Translational Medicine, Tianjin University, Tianjin, China.
Technol Health Care. 2023;31(2):527-538. doi: 10.3233/THC-220141.
Colposcopy is one of the common methods of cervical cancer screening. The type of cervical transformation zone is considered one of the important factors for grading colposcopic findings and choosing treatment.
This study aims to develop a deep learning-based method for automatic classification of cervical transformation zone from colposcopy images.
We proposed a multiscale feature fusion classification network to classify cervical transformation zone, which can extract features from images and fuse them at multiple scales. Cervical regions were first detected from original colposcopy images and then fed into our multiscale feature fusion classification network.
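To illustrate the general idea of multiscale feature fusion described above, here is a minimal NumPy sketch. It is not the authors' network: the pooling scales, feature-map size, and linear head are all hypothetical, and a real implementation would use a CNN backbone and learned weights. The sketch only shows the core operation of pooling one feature map at several spatial scales and concatenating the results before classification into the three transformation-zone types.

```python
import numpy as np

def avg_pool(feat, out_size):
    """Average-pool an (H, W, C) feature map down to (out_size, out_size, C)."""
    h, w, c = feat.shape
    hs, ws = h // out_size, w // out_size
    pooled = np.zeros((out_size, out_size, c))
    for i in range(out_size):
        for j in range(out_size):
            pooled[i, j] = feat[i * hs:(i + 1) * hs,
                                j * ws:(j + 1) * ws].mean(axis=(0, 1))
    return pooled

def multiscale_fuse(feat, scales=(1, 2, 4)):
    """Pool the feature map at several scales and concatenate the flattened results."""
    return np.concatenate([avg_pool(feat, s).ravel() for s in scales])

rng = np.random.default_rng(0)
feat = rng.standard_normal((32, 32, 8))   # hypothetical backbone output for one image
fused = multiscale_fuse(feat)             # 8 * (1 + 4 + 16) = 168-dim fused vector
W = rng.standard_normal((168, 3))         # hypothetical linear head: 3 TZ types
pred = int(np.argmax(fused @ W))          # predicted transformation-zone type index
```

In the paper's pipeline, the input to such a fusion step would be the feature maps computed from the detected cervical region, and the fusion weights would be learned end to end rather than fixed.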
The results on the test dataset showed that, compared with state-of-the-art image classification models, the proposed classification network achieved the highest classification accuracy, reaching 88.49%, and the sensitivities for type 1, type 2, and type 3 were 90.12%, 85.95%, and 89.45%, respectively, all higher than those of the comparison methods.
The proposed method can automatically classify cervical transformation zone in colposcopy images, and can be used as an auxiliary tool in cervical cancer screening.