
Improving Generalization and Personalization in Model-Heterogeneous Federated Learning.

Authors

Zhang Xiongtao, Wang Ji, Bao Weidong, Zhang Yaohong, Zhu Xiaomin, Peng Hao, Zhao Xiang

Publication information

IEEE Trans Neural Netw Learn Syst. 2025 Jan;36(1):88-101. doi: 10.1109/TNNLS.2024.3405190. Epub 2025 Jan 7.

Abstract

Conventional federated learning (FL) assumes model homogeneity, requiring clients to expose their model parameters in order to improve the server model. This assumption rarely holds in real-world scenarios: sharing models and parameters raises security concerns for users, and focusing solely on the server-side model neglects clients' personalization requirements, potentially impeding the performance improvements users expect. Conversely, prioritizing personalization may compromise the generalization of the server model, hindering broad knowledge transfer. To address these challenges, we pose a key question: how can FL ensure both generalization and personalization when clients' models are heterogeneous? In this work, we introduce FedTED, which leverages a twin-branch structure and data-free knowledge distillation (DFKD) to address the challenges posed by model heterogeneity and diverse objectives in FL. These techniques yield significant improvements in both personalization and generalization, while effectively coordinating the updates of clients' heterogeneous models and reconstructing a satisfactory global model. Our empirical evaluation demonstrates that FedTED outperforms many representative algorithms, particularly when clients' models are heterogeneous, achieving a 19.37% improvement in generalization performance and up to 9.76% in personalization performance.
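The abstract does not give implementation details, so the following is a minimal sketch of what a twin-branch client could look like: a (possibly client-specific) feature extractor feeding a personalized head, kept local, and a generic head aligned with the server through a distillation loss. In DFKD the teacher logits would come from synthetic samples rather than real data; here they are a stand-in tensor. The names TwinBranchClient, client_loss, and the weighting lam are hypothetical, not from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TwinBranchClient(nn.Module):
    """Hypothetical twin-branch client: a heterogeneous feature extractor
    feeds two heads, one personalized (kept local) and one generic
    (aligned with the server via knowledge distillation)."""
    def __init__(self, extractor: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.extractor = extractor  # architecture may differ across clients
        self.personal_head = nn.Linear(feat_dim, num_classes)
        self.generic_head = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        z = self.extractor(x)
        return self.personal_head(z), self.generic_head(z)

def client_loss(logits_p, logits_g, y, teacher_logits, tau=2.0, lam=0.5):
    """Supervised loss on both branches, plus a KL distillation term
    pulling the generic branch toward the server/teacher predictions."""
    ce = F.cross_entropy(logits_p, y) + F.cross_entropy(logits_g, y)
    kd = F.kl_div(F.log_softmax(logits_g / tau, dim=1),
                  F.softmax(teacher_logits / tau, dim=1),
                  reduction="batchmean") * tau * tau
    return ce + lam * kd

# Toy usage with a tiny extractor and random data.
extractor = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 64), nn.ReLU())
model = TwinBranchClient(extractor, feat_dim=64, num_classes=10)
x, y = torch.randn(8, 1, 28, 28), torch.randint(0, 10, (8,))
teacher_logits = torch.randn(8, 10)  # placeholder for server-side predictions
lp, lg = model(x)
client_loss(lp, lg, y, teacher_logits).backward()
```

Because only the generic branch is tied to the server, clients with heterogeneous extractors can still exchange knowledge through distilled predictions instead of raw parameters, which is consistent with the privacy motivation stated above.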
