FedART: A neural model integrating federated learning and adaptive resonance theory.

Authors

Pateria Shubham, Subagdja Budhitama, Tan Ah-Hwee

Affiliations

School of Computing and Information Systems, Singapore Management University, Singapore.

Publication

Neural Netw. 2025 Jan;181:106845. doi: 10.1016/j.neunet.2024.106845. Epub 2024 Nov 4.

DOI: 10.1016/j.neunet.2024.106845
PMID: 39536601
Abstract

Federated Learning (FL) has emerged as a promising paradigm for collaborative model training across distributed clients while preserving data privacy. However, prevailing FL approaches aggregate the clients' local models into a global model through multi-round iterative parameter averaging. This leads to the undesirable bias of the aggregated model towards certain clients in the presence of heterogeneous data distributions among the clients. Moreover, such approaches are restricted to supervised classification tasks and do not support unsupervised clustering. To address these limitations, we propose a novel one-shot FL approach called Federated Adaptive Resonance Theory (FedART) which leverages self-organizing Adaptive Resonance Theory (ART) models to learn category codes, where each code represents a cluster of similar data samples. In FedART, the clients learn to associate their private data with various local category codes. Under heterogeneity, the local codes across different clients represent heterogeneous data. In turn, a global model takes these local codes as inputs and aggregates them into global category codes, wherein heterogeneous client data is indirectly represented by distinctly encoded global codes, in contrast to the averaging out of parameters in the existing approaches. This enables the learned global model to handle heterogeneous data. In addition, FedART employs a universal learning mechanism to support both federated classification and clustering tasks. Our experiments conducted on various federated classification and clustering tasks show that FedART consistently outperforms state-of-the-art FL methods on data with heterogeneous distribution across clients.
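The aggregation idea described in the abstract (each client self-organizes its private data into category codes, and the server clusters the collected codes in one shot instead of averaging model parameters) can be sketched with a minimal Fuzzy ART model. The following is an illustrative sketch only, not the authors' FedART implementation: the class name `FuzzyART`, the hyperparameter values (vigilance `rho`, choice `alpha`, learning rate `beta`), and the toy client data are all our own assumptions.

```python
import numpy as np

class FuzzyART:
    """Minimal Fuzzy ART: learns category codes (weight vectors) online."""

    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho, self.alpha, self.beta = rho, alpha, beta
        self.w = []  # one complement-coded weight vector per category code

    def _cc(self, x):
        # complement coding: inputs in [0,1]^d become [x, 1-x] in [0,1]^2d
        return np.concatenate([x, 1.0 - x])

    def train(self, X):
        for x in X:
            i = self._cc(np.asarray(x, dtype=float))
            # rank existing categories by the choice function T_j
            order = sorted(
                range(len(self.w)),
                key=lambda j: -np.minimum(i, self.w[j]).sum()
                / (self.alpha + self.w[j].sum()),
            )
            for j in order:
                m = np.minimum(i, self.w[j])  # fuzzy AND of input and weight
                if m.sum() / i.sum() >= self.rho:  # vigilance (resonance) test
                    self.w[j] = self.beta * m + (1 - self.beta) * self.w[j]
                    break
            else:
                self.w.append(i.copy())  # no resonance: commit a new category
        return self

    def codes(self):
        # category prototypes in input space (first half of each weight)
        d = len(self.w[0]) // 2
        return [w[:d] for w in self.w]

# Toy one-shot round: two clients with heterogeneous (well-separated) data.
a = FuzzyART().train([[0.10, 0.10], [0.12, 0.10]])  # client A's local codes
b = FuzzyART().train([[0.90, 0.90], [0.88, 0.92]])  # client B's local codes
# The server clusters the clients' local codes into global category codes,
# so each client's data region keeps its own distinct global code.
server = FuzzyART().train(a.codes() + b.codes())
```

In this sketch each client contributes one local code, and the server's vigilance test keeps the two heterogeneous codes as separate global categories rather than averaging them together, which is the contrast with parameter-averaging FL that the abstract draws.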


Similar Articles

1. FedART: A neural model integrating federated learning and adaptive resonance theory.
   Neural Netw. 2025 Jan;181:106845. doi: 10.1016/j.neunet.2024.106845. Epub 2024 Nov 4.
2. Contrastive encoder pre-training-based clustered federated learning for heterogeneous data.
   Neural Netw. 2023 Aug;165:689-704. doi: 10.1016/j.neunet.2023.06.010. Epub 2023 Jun 10.
3. FedADMM-InSa: An inexact and self-adaptive ADMM for federated learning.
   Neural Netw. 2025 Jan;181:106772. doi: 10.1016/j.neunet.2024.106772. Epub 2024 Oct 1.
4. A Cluster-Driven Adaptive Training Approach for Federated Learning.
   Sensors (Basel). 2022 Sep 18;22(18):7061. doi: 10.3390/s22187061.
5. StoCFL: A stochastically clustered federated learning framework for Non-IID data with dynamic client participation.
   Neural Netw. 2025 Jul;187:107278. doi: 10.1016/j.neunet.2025.107278. Epub 2025 Feb 22.
6. Differentially Private Client Selection and Resource Allocation in Federated Learning for Medical Applications Using Graph Neural Networks.
   Sensors (Basel). 2024 Aug 8;24(16):5142. doi: 10.3390/s24165142.
7. FedBM: Stealing knowledge from pre-trained language models for heterogeneous federated learning.
   Med Image Anal. 2025 May;102:103524. doi: 10.1016/j.media.2025.103524. Epub 2025 Mar 7.
8. Learn the global prompt in the low-rank tensor space for heterogeneous federated learning.
   Neural Netw. 2025 Jul;187:107319. doi: 10.1016/j.neunet.2025.107319. Epub 2025 Mar 5.
9. A robust and personalized privacy-preserving approach for adaptive clustered federated distillation.
   Sci Rep. 2025 Apr 23;15(1):14069. doi: 10.1038/s41598-025-96468-8.
10. PGFed: Personalize Each Client's Global Objective for Federated Learning.
    Proc IEEE Int Conf Comput Vis. 2023 Oct;2023:3923-3933. doi: 10.1109/iccv51070.2023.00365. Epub 2024 Jan 15.