


FedADT: An Adaptive Method Based on Derivative Term for Federated Learning.

Authors

Gao Huimin, Wu Qingtao, Zhao Xuhui, Zhu Junlong, Zhang Mingchuan

Affiliations

School of Information Engineering, Henan University of Science and Technology, Luoyang 471023, China.

Intelligent System Science and Technology Innovation Center, Longmen Laboratory, Luoyang 471023, China.

Publication

Sensors (Basel). 2023 Jun 29;23(13):6034. doi: 10.3390/s23136034.

DOI: 10.3390/s23136034
PMID: 37447882
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC10347066/
Abstract

Federated learning serves as a novel distributed training framework that enables multiple internet-of-things clients to collaboratively train a global model while the data remains local. However, the implementation of federated learning faces many problems in practice, such as the large number of training rounds required for convergence due to the size of the model, and the lack of adaptivity in the stochastic gradient-based updates at the client side. Meanwhile, the optimization process is sensitive to noise, which can affect the performance of the final model. For these reasons, we propose Federated Adaptive learning based on the Derivative Term, called FedADT in this paper, which incorporates an adaptive step size and the difference of gradients into the update of the local model. To further reduce the influence of noise on the derivative term, which is estimated by the difference of gradients, we apply moving average decay to the derivative term. Moreover, we analyze the convergence performance of the proposed algorithm for non-convex objective functions; i.e., a convergence rate of 1/nT can be achieved by choosing appropriate hyper-parameters, where n is the number of clients and T is the number of iterations. Finally, various experiments on the image classification task are conducted by training widely used convolutional neural networks on the MNIST and Fashion MNIST datasets to verify the effectiveness of FedADT. In addition, the receiver operating characteristic curve is used to display the results of the proposed algorithm when predicting clothing categories on the Fashion MNIST dataset.
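The abstract only outlines the local update rule, so the following is a minimal sketch of the idea, not the paper's exact formulation: an AdaGrad-style adaptive step size combined with a gradient-difference ("derivative") term smoothed by an exponential moving average. The function name `fedadt_local_update` and the hyper-parameters `eta`, `beta`, and `gamma` are illustrative assumptions.

```python
import numpy as np

def fedadt_local_update(w, grad_fn, steps, eta=0.1, beta=0.9, gamma=0.1):
    """Illustrative client-side update: adaptive step size plus a
    moving-average-smoothed difference of gradients (derivative term)."""
    g_prev = np.zeros_like(w)   # previous gradient, for the difference term
    d = np.zeros_like(w)        # moving average of the gradient difference
    v = np.zeros_like(w)        # accumulated squared gradients (adaptive step)
    for _ in range(steps):
        g = grad_fn(w)
        # Smooth the gradient difference to damp noise in the derivative term.
        d = beta * d + (1.0 - beta) * (g - g_prev)
        # AdaGrad-style accumulator gives a per-coordinate adaptive step size.
        v = v + g * g
        step = eta / (np.sqrt(v) + 1e-8)
        # Descend along the gradient plus a weighted derivative term.
        w = w - step * (g + gamma * d)
        g_prev = g
    return w
```

In a federated round, each client would run this update on its local data and the server would average the returned models, FedAvg-style; the sketch omits that aggregation step.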


Similar Articles

1. FedADT: An Adaptive Method Based on Derivative Term for Federated Learning.
   Sensors (Basel). 2023 Jun 29;23(13):6034. doi: 10.3390/s23136034.
2. An EMD-Based Adaptive Client Selection Algorithm for Federated Learning in Heterogeneous Data Scenarios.
   Front Plant Sci. 2022 Jun 9;13:908814. doi: 10.3389/fpls.2022.908814. eCollection 2022.
3. Performance Enhancement in Federated Learning by Reducing Class Imbalance of Non-IID Data.
   Sensors (Basel). 2023 Jan 19;23(3):1152. doi: 10.3390/s23031152.
4. Secure and decentralized federated learning framework with non-IID data based on blockchain.
   Heliyon. 2024 Feb 29;10(5):e27176. doi: 10.1016/j.heliyon.2024.e27176. eCollection 2024 Mar 15.
5. Stochastic Channel-Based Federated Learning With Neural Network Pruning for Medical Data Privacy Preservation: Model Development and Experimental Validation.
   JMIR Form Res. 2020 Dec 22;4(12):e17265. doi: 10.2196/17265.
6. Boosted federated learning based on improved Particle Swarm Optimization for healthcare IoT devices.
   Comput Biol Med. 2023 Sep;163:107195. doi: 10.1016/j.compbiomed.2023.107195. Epub 2023 Jun 22.
7. A distribution information sharing federated learning approach for medical image data.
   Complex Intell Systems. 2023 Mar 29:1-12. doi: 10.1007/s40747-023-01035-1.
8. Bidirectional Decoupled Distillation for Heterogeneous Federated Learning.
   Entropy (Basel). 2024 Sep 5;26(9):762. doi: 10.3390/e26090762.
9. An Optimization Method for Non-IID Federated Learning Based on Deep Reinforcement Learning.
   Sensors (Basel). 2023 Nov 16;23(22):9226. doi: 10.3390/s23229226.
10. Efficient Federated Learning Via Local Adaptive Amended Optimizer With Linear Speedup.
    IEEE Trans Pattern Anal Mach Intell. 2023 Dec;45(12):14453-14464. doi: 10.1109/TPAMI.2023.3300886. Epub 2023 Nov 3.

References Cited in This Article

1. Efficient Gradient Updating Strategies with Adaptive Power Allocation for Federated Learning over Wireless Backhaul.
   Sensors (Basel). 2021 Oct 13;21(20):6791. doi: 10.3390/s21206791.
2. PID Controller-Guided Attention Neural Network Learning for Fast and Effective Real Photographs Denoising.
   IEEE Trans Neural Netw Learn Syst. 2022 Jul;33(7):3010-3023. doi: 10.1109/TNNLS.2020.3048031. Epub 2022 Jul 6.
3. Federated Learning for Healthcare Informatics.
   J Healthc Inform Res. 2021;5(1):1-19. doi: 10.1007/s41666-020-00082-4. Epub 2020 Nov 12.