Generalizable Heterogeneous Federated Cross-Correlation and Instance Similarity Learning.

Author Information

Huang Wenke, Ye Mang, Shi Zekun, Du Bo

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2023 Oct 25;PP. doi: 10.1109/TPAMI.2023.3327373.

Abstract

Federated learning is an important privacy-preserving multi-party learning paradigm, involving collaborative learning with others and local updating on private data. Model heterogeneity and catastrophic forgetting are two crucial challenges, which greatly limit applicability and generalizability. This paper presents FCCL+, a novel federated correlation and similarity learning method with non-target distillation, facilitating both intra-domain discriminability and inter-domain generalization. For the heterogeneity issue, we leverage irrelevant unlabeled public data for communication between the heterogeneous participants. We construct a cross-correlation matrix and align instance similarity distributions at both the logit and feature levels, which effectively overcomes the communication barrier and improves generalization ability. For catastrophic forgetting in the local updating stage, FCCL+ introduces Federated Non Target Distillation, which retains inter-domain knowledge while avoiding the optimization conflict issue, fully distilling privileged inter-domain information by depicting posterior relations among classes. Considering that there is no standard benchmark for evaluating existing heterogeneous federated learning methods under the same setting, we present a comprehensive benchmark with extensive representative methods under four domain-shift scenarios, supporting both heterogeneous and homogeneous federated settings. Empirical results demonstrate the superiority of our method and the efficiency of its modules in various scenarios. The benchmark code for reproducing our results is available at https://github.com/WenkeHuang/FCCL.
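For intuition, below is a minimal PyTorch-style sketch of the two ingredients the abstract describes: cross-correlation alignment between two participants' logits on a shared unlabeled public batch, and a non-target distillation term that transfers only the posterior relations among non-target classes. The function names, temperature tau, and off-diagonal weight lamb are illustrative assumptions rather than the paper's exact implementation; the authors' version is in the released code at https://github.com/WenkeHuang/FCCL.

```python
import torch
import torch.nn.functional as F

def cross_correlation_loss(logits_a, logits_b, lamb=0.005):
    # Hypothetical sketch: align two participants' (N x C) logits on the same
    # unlabeled public batch via a C x C cross-correlation matrix, pushing the
    # diagonal toward 1 (per-class agreement) and off-diagonals toward 0.
    a = (logits_a - logits_a.mean(0)) / (logits_a.std(0) + 1e-6)
    b = (logits_b - logits_b.mean(0)) / (logits_b.std(0) + 1e-6)
    n = a.size(0)
    corr = a.t() @ b / n
    on_diag = (torch.diagonal(corr) - 1).pow(2).sum()
    off_diag = (corr - torch.diag(torch.diagonal(corr))).pow(2).sum()
    return on_diag + lamb * off_diag

def non_target_distillation_loss(student_logits, teacher_logits, labels, tau=2.0):
    # Hypothetical sketch: drop each sample's target class, renormalize over the
    # remaining classes, and match the student's non-target posterior to that of
    # a frozen teacher carrying inter-domain knowledge, so the supervised target
    # signal does not conflict with the distilled inter-domain information.
    n, c = student_logits.shape
    mask = F.one_hot(labels, c).bool()
    s = student_logits[~mask].view(n, c - 1)
    t = teacher_logits[~mask].view(n, c - 1)
    return F.kl_div(F.log_softmax(s / tau, dim=1),
                    F.softmax(t / tau, dim=1),
                    reduction='batchmean') * tau * tau
```

In a heterogeneous setting each participant keeps its own architecture; only outputs computed on the shared public batch are exchanged for the alignment term, so private data never leaves the local side.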
