
Efficient Federated Learning Via Local Adaptive Amended Optimizer With Linear Speedup

Author Information

Sun Yan, Shen Li, Sun Hao, Ding Liang, Tao Dacheng

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2023 Dec;45(12):14453-14464. doi: 10.1109/TPAMI.2023.3300886. Epub 2023 Nov 3.

DOI: 10.1109/TPAMI.2023.3300886
PMID: 37527293
Abstract

Adaptive optimization has achieved notable success in distributed learning, but extending adaptive optimizers to federated learning (FL) suffers from severe inefficiency, including (i) rugged convergence due to inaccurate gradient estimation in the global adaptive optimizer, and (ii) client drift exacerbated by local over-fitting with the local adaptive optimizer. In this work, we propose a novel momentum-based algorithm that combines global gradient descent with a locally amended adaptive optimizer to tackle these difficulties. Specifically, we incorporate a local amendment technique into the adaptive optimizer, yielding the Federated Local ADaptive Amended optimizer (FedLADA), which estimates the global average offset from the previous communication round and corrects the local offset through a momentum-like term, further improving empirical training speed and mitigating heterogeneous over-fitting. Theoretically, we establish the convergence rate of FedLADA with a linear speedup property in the non-convex case under partial participation. Moreover, extensive experiments on real-world datasets demonstrate the efficacy of the proposed FedLADA, which greatly reduces the number of communication rounds and achieves higher accuracy than several baselines.
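To make the amended update concrete, the following is a minimal Python/NumPy sketch of a FedLADA-style round, based only on the abstract's description: clients take Adam-like local steps whose direction is blended, via a momentum-like coefficient, with the global average offset estimated in the previous communication round. All names and hyperparameters here (local_adam_direction, amended_local_step, server_round, alpha, the offset normalization by the number of local steps) are illustrative assumptions, not the authors' reference implementation.

import numpy as np

def local_adam_direction(grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    # Standard Adam-style adaptive direction; moment state is kept per client.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    return lr * state["m"] / (np.sqrt(state["v"]) + eps)

def amended_local_step(w, grad, state, global_offset, alpha=0.9):
    # One amended local step: mix the local adaptive direction with the
    # global average offset from the previous round (momentum-like term).
    d = alpha * local_adam_direction(grad, state) + (1 - alpha) * global_offset
    return w - d

def server_round(w_global, client_models, num_local_steps):
    # FedAvg-style aggregation plus a per-step estimate of the global
    # average offset, broadcast for the next round's local amendment.
    deltas = [w_global - w_c for w_c in client_models]
    avg_delta = np.mean(deltas, axis=0)
    return w_global - avg_delta, avg_delta / num_local_steps

# Toy usage: 4 participating clients, K = 10 local steps on a quadratic loss.
rng = np.random.default_rng(0)
w_global, global_offset = rng.normal(size=5), np.zeros(5)
for round_idx in range(3):
    clients = []
    for _ in range(4):
        w, state = w_global.copy(), {"m": np.zeros(5), "v": np.zeros(5)}
        for _ in range(10):
            grad = w - 1.0  # gradient of 0.5 * ||w - 1||^2
            w = amended_local_step(w, grad, state, global_offset)
        clients.append(w)
    w_global, global_offset = server_round(w_global, clients, num_local_steps=10)

On the theory side, "linear speedup" in non-convex FL analyses typically means the dominant term of the convergence bound shrinks in proportion to the number of participating clients, e.g., bounds of the form O(1/sqrt(SKT)) for S sampled clients, K local steps, and T communication rounds; the precise rate and constants for FedLADA should be taken from the paper itself.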

Similar Articles

1. Efficient Federated Learning Via Local Adaptive Amended Optimizer With Linear Speedup.
IEEE Trans Pattern Anal Mach Intell. 2023 Dec;45(12):14453-14464. doi: 10.1109/TPAMI.2023.3300886. Epub 2023 Nov 3.

2. Stabilizing and Accelerating Federated Learning on Heterogeneous Data With Partial Client Participation.
IEEE Trans Pattern Anal Mach Intell. 2025 Jan;47(1):67-83. doi: 10.1109/TPAMI.2024.3469188. Epub 2024 Dec 4.

3. AdaSAM: Boosting sharpness-aware minimization with adaptive learning rate and momentum for training deep neural networks.
Neural Netw. 2024 Jan;169:506-519. doi: 10.1016/j.neunet.2023.10.044. Epub 2023 Nov 1.

4. Lazily Aggregated Quantized Gradient Innovation for Communication-Efficient Federated Learning.
IEEE Trans Pattern Anal Mach Intell. 2022 Apr;44(4):2031-2044. doi: 10.1109/TPAMI.2020.3033286. Epub 2022 Mar 4.

5. FedADT: An Adaptive Method Based on Derivative Term for Federated Learning.
Sensors (Basel). 2023 Jun 29;23(13):6034. doi: 10.3390/s23136034.

6. Federated learning with workload-aware client scheduling in heterogeneous systems.
Neural Netw. 2022 Oct;154:560-573. doi: 10.1016/j.neunet.2022.07.030. Epub 2022 Aug 1.

7. A novel adaptive cubic quasi-Newton optimizer for deep learning based medical image analysis tasks, validated on detection of COVID-19 and segmentation for COVID-19 lung infection, liver tumor, and optic disc/cup.
Med Phys. 2023 Mar;50(3):1528-1538. doi: 10.1002/mp.15969. Epub 2022 Oct 6.

8. An EMD-Based Adaptive Client Selection Algorithm for Federated Learning in Heterogeneous Data Scenarios.
Front Plant Sci. 2022 Jun 9;13:908814. doi: 10.3389/fpls.2022.908814. eCollection 2022.

9. FedDdrl: Federated Double Deep Reinforcement Learning for Heterogeneous IoT with Adaptive Early Client Termination and Local Epoch Adjustment.
Sensors (Basel). 2023 Feb 23;23(5):2494. doi: 10.3390/s23052494.

10. FedLGA: Toward System-Heterogeneity of Federated Learning via Local Gradient Approximation.
IEEE Trans Cybern. 2024 Jan;54(1):401-414. doi: 10.1109/TCYB.2023.3247365. Epub 2023 Dec 20.