Kim Sungyeup, Rim Beanbonyka, Choi Seongjun, Lee Ahyoung, Min Sedong, Hong Min
Department of Software Convergence, Soonchunhyang University, Asan 31538, Korea.
Department of Otolaryngology-Head and Neck Surgery, Cheonan Hospital, Soonchunhyang University College of Medicine, Cheonan 31151, Korea.
Diagnostics (Basel). 2022 Apr 6;12(4):915. doi: 10.3390/diagnostics12040915.
Chest X-ray (CXR) imaging enables earlier and easier diagnosis of lung disease. In this paper, we therefore propose a deep learning method using a transfer learning technique to classify lung diseases on CXR images, with the aim of improving the diagnostic efficiency and accuracy of computer-aided diagnosis (CAD) systems. Our proposed method is one-step, end-to-end learning: raw CXR images are fed directly into a deep learning model (EfficientNet v2-M), which extracts the features relevant to identifying disease categories. We evaluated the proposed method on three classes (normal, pneumonia, and pneumothorax) of the U.S. National Institutes of Health (NIH) data set and achieved validation performances of loss = 0.6933, accuracy = 82.15%, sensitivity = 81.40%, and specificity = 91.65%. We also evaluated it on four classes (normal, pneumonia, pneumothorax, and tuberculosis) of the Cheonan Soonchunhyang University Hospital (SCH) data set and achieved validation performances of loss = 0.7658, accuracy = 82.20%, sensitivity = 81.40%, and specificity = 94.48%; the testing accuracies for the normal, pneumonia, pneumothorax, and tuberculosis classes were 63.60%, 82.30%, 82.80%, and 89.90%, respectively.