

Federated learning using model projection for multi-center disease diagnosis with non-IID data.

Authors

Du Jie, Li Wei, Liu Peng, Vong Chi-Man, You Yongke, Lei Baiying, Wang Tianfu

Affiliations

National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, School of Biomedical Engineering, Medical School, Shenzhen University, Shenzhen 518060, Guangdong, China.

Artificial Intelligence Industrial Innovation Research Center, Shenzhen Institute for Advanced Study, University of Electronic Science and Technology of China, Shenzhen, 518110, China.

Publication

Neural Netw. 2024 Oct;178:106409. doi: 10.1016/j.neunet.2024.106409. Epub 2024 May 24.

DOI: 10.1016/j.neunet.2024.106409
PMID: 38823069
Abstract

Multi-center disease diagnosis aims to build a global model for all involved medical centers. Due to privacy concerns, it is infeasible to collect data from multiple centers for training (i.e., centralized learning). Federated Learning (FL) is a decentralized framework that enables multiple clients (e.g., medical centers) to collaboratively train a global model while retaining patient data locally for privacy. However, in practice, the data across medical centers are not independently and identically distributed (Non-IID), causing two challenging issues: (1) catastrophic forgetting at clients, i.e., the local model at clients will forget the knowledge received from the global model after local training, causing reduced performance; and (2) invalid aggregation at the server, i.e., the global model at the server may not be favorable to some clients after model aggregation, resulting in a slow convergence rate. To mitigate these issues, an innovative Federated learning using Model Projection (FedMoP) is proposed, which guarantees: (1) the loss of local model on global data does not increase after local training without accessing the global data so that the performance will not be degenerated; and (2) the loss of global model on local data does not increase after aggregation without accessing local data so that convergence rate can be improved. Extensive experimental results show that our FedMoP outperforms state-of-the-art FL methods in terms of accuracy, convergence rate and communication cost. In particular, our FedMoP also achieves comparable or even higher accuracy than centralized learning. Thus, our FedMoP can ensure privacy protection while outperforming centralized learning in accuracy and communication cost.
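The decentralized training loop the abstract describes can be illustrated with a minimal sketch. FedMoP's specific projection operator is not detailed in the abstract, so the code below shows only the generic federated round it builds on (FedAvg-style): each client trains on its own data, and the server averages the returned models. The toy clients here use noiseless least-squares data with different input distributions to mimic the non-IID setting; all function names and parameters are illustrative, not from the paper.

```python
import numpy as np

def local_train(weights, data, lr=0.05, steps=5):
    """Toy local update: a few gradient steps on a least-squares loss."""
    X, y = data
    w = weights.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg_round(global_w, client_data):
    """One communication round: clients train locally on private data,
    then the server averages the returned models (FedAvg aggregation)."""
    local_models = [local_train(global_w, d) for d in client_data]
    return np.mean(local_models, axis=0)

# Two clients with different (non-IID) input distributions but a shared
# underlying linear relationship.
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
clients = []
for shift in (0.0, 2.0):  # each client draws inputs around a different mean
    X = rng.normal(shift, 1.0, size=(50, 2))
    y = X @ w_true
    clients.append((X, y))

w = np.zeros(2)
for _ in range(30):
    w = fedavg_round(w, clients)
```

After 30 rounds the averaged model recovers the shared solution on this toy problem; on realistic non-IID data, plain averaging suffers exactly the forgetting and invalid-aggregation issues the paper targets, which is what FedMoP's projection constraints are designed to prevent.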


Similar Articles

1. Federated learning using model projection for multi-center disease diagnosis with non-IID data. Neural Netw. 2024 Oct;178:106409. doi: 10.1016/j.neunet.2024.106409. Epub 2024 May 24.
2. An Optimization Method for Non-IID Federated Learning Based on Deep Reinforcement Learning. Sensors (Basel). 2023 Nov 16;23(22):9226. doi: 10.3390/s23229226.
3. Secure and decentralized federated learning framework with non-IID data based on blockchain. Heliyon. 2024 Feb 29;10(5):e27176. doi: 10.1016/j.heliyon.2024.e27176. eCollection 2024 Mar 15.
4. FedMCC: Federated multi-center clustering algorithm to improve privacy healthcare. Methods. 2023 Oct;218:94-100. doi: 10.1016/j.ymeth.2023.07.006. Epub 2023 Jul 26.
5. FedSLD: Federated Learning with Shared Label Distribution for Medical Image Classification. Proc IEEE Int Symp Biomed Imaging. 2022 Mar;2022. doi: 10.1109/isbi52829.2022.9761404. Epub 2022 Apr 26.
6. Data-free knowledge distillation via generator-free data generation for Non-IID federated learning. Neural Netw. 2024 Nov;179:106627. doi: 10.1016/j.neunet.2024.106627. Epub 2024 Aug 10.
7. Distributed Detection of Malicious Android Apps While Preserving Privacy Using Federated Learning. Sensors (Basel). 2023 Feb 15;23(4):2198. doi: 10.3390/s23042198.
8. FedRAD: Heterogeneous Federated Learning via Relational Adaptive Distillation. Sensors (Basel). 2023 Jul 19;23(14):6518. doi: 10.3390/s23146518.
9. Adapt to Adaptation: Learning Personalization for Cross-Silo Federated Learning. IJCAI (U S). 2022 Jul;2022:2166-2173. doi: 10.24963/ijcai.2022/301.
10. Federated Learning With Taskonomy for Non-IID Data. IEEE Trans Neural Netw Learn Syst. 2022 Mar 22;PP. doi: 10.1109/TNNLS.2022.3152581.