Department of Information Technology, R.M.K Engineering College, Chennai, Tamil Nadu, India.
Department of Computer Science and Engineering, Rajalakshmi Engineering College, Chennai, Tamil Nadu, India.
J Imaging Inform Med. 2024 Aug;37(4):1488-1504. doi: 10.1007/s10278-024-01035-8. Epub 2024 Feb 29.
Breast cancer is a deadly disease that causes a considerable number of fatalities among women worldwide. Early and accurate detection is crucial to improving patient outcomes and survival rates. Machine learning techniques, particularly deep learning, have demonstrated impressive success in various image recognition tasks, including breast cancer classification. However, the reliance on large labeled datasets poses challenges in the medical domain due to privacy concerns and data silos. This study proposes a novel transfer learning approach integrated into a federated learning framework to address the limitations of scarce labeled data and data privacy in collaborative healthcare settings. For breast cancer classification, mammography and MRI images were gathered from three different medical centers. Federated learning, an emerging privacy-preserving paradigm, enables multiple medical institutions to jointly train a global model while keeping their data decentralized. The proposed methodology uses a pre-trained ResNet, a deep neural network architecture, as a feature extractor. By fine-tuning the higher layers of ResNet on breast cancer datasets from diverse medical centers, the model learns specialized features relevant to each domain while leveraging the general image representations acquired from large-scale datasets such as ImageNet. To overcome domain shift caused by variations in data distributions across medical centers, we introduce domain adversarial training: the model learns to minimize the domain discrepancy while maximizing classification accuracy, thereby acquiring domain-invariant features. We conducted extensive experiments on breast cancer datasets obtained from multiple medical centers and compared the proposed approach against traditional standalone training and federated learning without domain adaptation. Compared with traditional models, the proposed model achieved a classification accuracy of 98.8% with a computational time of 12.22 s. The results show promising gains in classification accuracy and model generalization, underscoring the potential of our method to improve breast cancer classification performance while upholding data privacy in a federated healthcare environment.
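The pipeline described above can be illustrated with a short sketch. The following PyTorch/torchvision code is not the authors' implementation; the choice of ResNet-50, the three-domain setup, equal-weight FedAvg aggregation, and all class and variable names are assumptions for illustration only. It shows the three ingredients the abstract names: an ImageNet-pretrained ResNet backbone whose higher layers are fine-tuned, a gradient-reversal domain head for domain-adversarial training, and FedAvg-style averaging of client weights into a global model.

```python
# Minimal sketch, assuming PyTorch and torchvision >= 0.13 (for the `weights` API).
# Not the authors' code; ResNet-50 and the layer-freezing choice are illustrative.
import copy
import torch
import torch.nn as nn
from torchvision import models

class GradReverse(torch.autograd.Function):
    """Identity on the forward pass; reverses (and scales) gradients on backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_out):
        return -ctx.lam * grad_out, None

class DomainAdversarialResNet(nn.Module):
    """Pre-trained ResNet feature extractor with a tumor-class head and a
    gradient-reversed domain head (one domain per medical center)."""
    def __init__(self, num_classes=2, num_domains=3, lam=1.0):
        super().__init__()
        backbone = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
        self.features = nn.Sequential(*list(backbone.children())[:-1])  # drop fc
        # Freeze the lower layers; fine-tune only the last residual stage
        # (child index 7 = layer4 in this Sequential wrapping).
        for name, p in self.features.named_parameters():
            p.requires_grad = name.startswith("7.")
        self.classifier = nn.Linear(2048, num_classes)   # benign/malignant head
        self.domain_head = nn.Linear(2048, num_domains)  # which-center head
        self.lam = lam

    def forward(self, x):
        f = self.features(x).flatten(1)
        return self.classifier(f), self.domain_head(GradReverse.apply(f, self.lam))

def fedavg(global_model, client_models):
    """Plain FedAvg: average client weights (equal weights) into the global model."""
    avg = copy.deepcopy(client_models[0].state_dict())
    for k in avg:
        avg[k] = torch.stack(
            [m.state_dict()[k].float() for m in client_models]).mean(0)
    global_model.load_state_dict(avg)
    return global_model
```

In a federated round under these assumptions, each center would train a copy of the global model on its local mammography/MRI data using the classification loss plus the (gradient-reversed) domain loss, and only the resulting weights, never the images, would be returned to the server for averaging.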