Lazily Aggregated Quantized Gradient Innovation for Communication-Efficient Federated Learning.

Author Information

Sun Jun, Chen Tianyi, Giannakis Georgios B, Yang Qinmin, Yang Zaiyue

Publication Information

IEEE Trans Pattern Anal Mach Intell. 2022 Apr;44(4):2031-2044. doi: 10.1109/TPAMI.2020.3033286. Epub 2022 Mar 4.

Abstract

This paper focuses on the communication-efficient federated learning problem and develops a novel distributed quantized gradient approach characterized by adaptive communication of quantized gradients. Specifically, federated learning builds on a server-worker architecture in which the workers compute local gradients and upload them to the server; the server then obtains the global gradient by aggregating all local gradients and uses it to update the model parameters. The key idea for saving worker-to-server communication is to quantize gradients and to skip less informative quantized gradient communications by reusing previous gradients. Quantizing and skipping result in 'lazy' worker-server communication, which justifies the term Lazily Aggregated Quantized (LAQ) gradient. Theoretically, the LAQ algorithm achieves the same linear convergence as gradient descent in the strongly convex case, while effecting major savings in communication in terms of transmitted bits and communication rounds. Empirically, extensive experiments on realistic data corroborate a significant communication reduction compared with state-of-the-art gradient- and stochastic-gradient-based algorithms.
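To make the round structure described in the abstract concrete, the following is a minimal Python sketch of one LAQ-style communication round. It is an illustrative simplification, not the paper's exact procedure: the uniform quantizer, the fixed `skip_threshold`, and the names `quantize` and `laq_round` are assumptions introduced here, whereas the paper selects which workers skip with an adaptive criterion based on recent model changes.

```python
import numpy as np


def quantize(grad, ref, num_bits=4):
    """Uniformly quantize grad around a reference ref with num_bits per
    coordinate (a simple stand-in for the quantizer in the paper)."""
    radius = np.max(np.abs(grad - ref)) + 1e-12     # dynamic range of the innovation
    levels = 2 ** num_bits - 1
    step = 2.0 * radius / levels
    codes = np.round((grad - ref + radius) / step)  # integer codes in {0, ..., levels}
    return ref + codes * step - radius              # de-quantized gradient


def laq_round(model, worker_grad_fns, prev_q, lr=0.1, skip_threshold=1e-3):
    """One simplified LAQ-style round: each worker quantizes its gradient and
    skips the upload when the quantized innovation is small; the server reuses
    the stale quantized gradient for skipped workers and takes a GD step."""
    new_q = []
    for grad_fn, q_prev in zip(worker_grad_fns, prev_q):
        g = grad_fn(model)                          # local gradient at the worker
        q_new = quantize(g, q_prev)
        innovation = np.linalg.norm(q_new - q_prev) ** 2
        if innovation < skip_threshold:             # skip: nothing is uploaded this round
            new_q.append(q_prev)
        else:                                       # upload the fresh quantized gradient
            new_q.append(q_new)
    aggregate = np.mean(new_q, axis=0)              # server aggregates stale + fresh gradients
    return model - lr * aggregate, new_q            # gradient-descent update


if __name__ == "__main__":
    # Toy usage: two workers with quadratic losses 0.5 * ||x - c_i||^2.
    centers = [np.array([1.0, -2.0]), np.array([3.0, 0.5])]
    grad_fns = [lambda x, c=c: x - c for c in centers]
    model = np.zeros(2)
    prev_q = [np.zeros(2) for _ in grad_fns]
    for _ in range(200):
        model, prev_q = laq_round(model, grad_fns, prev_q)
    print(model)  # drifts toward the average-loss minimizer, roughly [2.0, -0.75]
```

In the toy example, the model approaches the minimizer of the average loss only up to the quantization and skipping tolerance, mirroring the trade-off between accuracy and communication that the adaptive criterion in the paper is designed to control.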
