Department of Bioengineering, University of California Riverside, Riverside, CA, 92521, USA.
School of Mechanical and Manufacturing Engineering (SMME), National University of Science and Technology (NUST), Islamabad, Pakistan.
Sci Rep. 2024 Oct 28;14(1):25809. doi: 10.1038/s41598-024-76512-9.
Computer-assisted diagnosis (CAD) plays a key role in cancer diagnosis and screening. However, current CAD systems perform poorly on whole slide image (WSI) analysis and therefore fail to generalize well. This research aims to develop an automatic classification system that distinguishes between different types of carcinoma. Obtaining rich deep features in multi-class classification while achieving high accuracy remains a challenging problem. Detecting and classifying cancerous cells in WSIs is particularly difficult because normal lumps are easily misclassified as cancerous cells, owing to clutter, occlusion, and irregular cell distribution. Past work has mostly relied on hand-crafted features while neglecting these challenges, which reduced classification accuracy. To mitigate this problem, we propose an efficient dual attention-based network (CytoNet). The proposed network is composed of two main modules: (i) EfficientNet and (ii) a Dual Attention Module (DAM). EfficientNet achieves higher accuracy and efficiency than existing Convolutional Neural Networks (CNNs) and, having been pre-trained on ImageNet, provides highly generic features. DAM, in turn, is robust in extracting attention-weighted, targeted features while suppressing the background. The combination of an efficient backbone and an attention module thus yields robust, intrinsic features and strong performance. We evaluated the proposed network on two well-known datasets: (i) our generated thyroid dataset and (ii) the Mendeley Cervical dataset (Hussain, Data Brief, 2019), achieving better performance than its counterparts. CytoNet achieved 99% accuracy on the thyroid dataset. The precision, recall, and F1-score on the Mendeley Cervical dataset are 0.992, 0.985, and 0.977, respectively. The code implementation is available on GitHub: https://github.com/naveedilyas/CytoNet-An-Efficient-Dual-Attention-based-Automatic-Prediction-of-Cancer-Sub-types-in-Cytol.
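To make the described architecture concrete, below is a minimal PyTorch sketch of a CytoNet-style model: an ImageNet-pretrained EfficientNet backbone followed by a dual attention block and a classification head. The abstract does not specify the internal design of the DAM, so the channel-plus-spatial attention layout, the EfficientNet-B0 variant from torchvision, and the class count in the usage line are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a CytoNet-style classifier (assumed design, not the paper's exact code).
import torch
import torch.nn as nn
from torchvision import models


class DualAttention(nn.Module):
    """Assumed DAM layout: channel attention followed by spatial attention."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Channel attention: squeeze spatial dims, then re-weight feature channels.
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        # Spatial attention: single-channel map that suppresses background regions.
        self.spatial_gate = nn.Sequential(
            nn.Conv2d(channels, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = x * self.channel_gate(x)
        x = x * self.spatial_gate(x)
        return x


class CytoNetSketch(nn.Module):
    def __init__(self, num_classes: int):
        super().__init__()
        # ImageNet-pretrained EfficientNet-B0 backbone (1280 output channels).
        backbone = models.efficientnet_b0(
            weights=models.EfficientNet_B0_Weights.IMAGENET1K_V1
        )
        self.features = backbone.features
        self.attention = DualAttention(1280)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(1280, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.attention(self.features(x)))


# Usage example: classify a batch of 224x224 cytology patches into 4 hypothetical sub-types.
model = CytoNetSketch(num_classes=4)
logits = model(torch.randn(2, 3, 224, 224))
print(logits.shape)  # torch.Size([2, 4])
```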