

Model Pruning Enables Efficient Federated Learning on Edge Devices.

Authors

Jiang Yuang, Wang Shiqiang, Valls Victor, Ko Bong Jun, Lee Wei-Han, Leung Kin K, Tassiulas Leandros

Publication

IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):10374-10386. doi: 10.1109/TNNLS.2022.3166101. Epub 2023 Nov 30.

DOI: 10.1109/TNNLS.2022.3166101
PMID: 35468066
Abstract

Federated learning (FL) allows model training from local data collected by edge/mobile devices while preserving data privacy, which has wide applicability to image and vision applications. A challenge is that client devices in FL usually have much more limited computation and communication resources compared to servers in a data center. To overcome this challenge, we propose PruneFL, a novel FL approach with adaptive and distributed parameter pruning, which adapts the model size during FL to reduce both communication and computation overhead and minimize the overall training time, while maintaining a similar accuracy as the original model. PruneFL includes initial pruning at a selected client and further pruning as part of the FL process. The model size is adapted during this process, which includes maximizing the approximate empirical risk reduction divided by the time of one FL round. Our experiments with various datasets on edge devices (e.g., Raspberry Pi) show that: 1) we significantly reduce the training time compared to conventional FL and various other pruning-based methods and 2) the pruned model with automatically determined size converges to an accuracy that is very similar to the original model, and it is also a lottery ticket of the original model.
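The size-adaptation criterion described above (maximize the approximate empirical risk reduction divided by the time of one FL round) can be sketched as a simple search over how many parameters to keep. This is a minimal illustration, not the authors' implementation: the per-parameter importance scores, the linear round-time model, and all names here are assumptions for the sketch.

```python
# Hedged sketch of PruneFL's size-adaptation idea: choose the number of kept
# parameters k that maximizes (approximate risk reduction of the top-k
# parameters) / (time of one FL round at size k). The linear time model
# fixed_time + k * per_param_time is an assumption, not from the paper.
import numpy as np

def choose_kept_params(importance, per_param_time, fixed_time):
    """Return the number of parameters k to keep.

    importance: importance score per parameter (e.g., a squared-gradient
        proxy); keeping a parameter adds its score to the estimated
        empirical risk reduction.
    per_param_time: communication/computation time per kept parameter.
    fixed_time: per-round overhead independent of model size.
    """
    order = np.sort(importance)[::-1]        # most important scores first
    gains = np.cumsum(order)                 # risk-reduction proxy for top-k
    k = np.arange(1, len(order) + 1)
    ratio = gains / (fixed_time + k * per_param_time)
    return int(k[np.argmax(ratio)])

# Toy usage: with these scores, keeping more than the top few parameters
# adds little gain but still costs round time, so a small k wins.
scores = np.array([5.0, 3.0, 1.0, 0.5, 0.1])
best_k = choose_kept_params(scores, per_param_time=0.2, fixed_time=1.0)
```

The ratio objective captures the trade-off in the abstract: a larger model reduces empirical risk faster per round but makes each round slower, and the adaptive rule picks the size where the two balance.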


Similar Articles

1. Model Pruning Enables Efficient Federated Learning on Edge Devices.
   IEEE Trans Neural Netw Learn Syst. 2023 Dec;34(12):10374-10386. doi: 10.1109/TNNLS.2022.3166101. Epub 2023 Nov 30.
2. A Communication-Efficient, Privacy-Preserving Federated Learning Algorithm Based on Two-Stage Gradient Pruning and Differentiated Differential Privacy.
   Sensors (Basel). 2023 Nov 21;23(23):9305. doi: 10.3390/s23239305.
3. Communication-efficient federated learning.
   Proc Natl Acad Sci U S A. 2021 Apr 27;118(17). doi: 10.1073/pnas.2024789118.
4. OnDev-LCT: On-Device Lightweight Convolutional Transformers towards federated learning.
   Neural Netw. 2024 Feb;170:635-649. doi: 10.1016/j.neunet.2023.11.044. Epub 2023 Nov 23.
5. DSFedCon: Dynamic Sparse Federated Contrastive Learning for Data-Driven Intelligent Systems.
   IEEE Trans Neural Netw Learn Syst. 2025 Feb;36(2):3343-3355. doi: 10.1109/TNNLS.2024.3349400. Epub 2025 Feb 6.
6. Towards Efficient Federated Learning: Layer-Wise Pruning-Quantization Scheme and Coding Design.
   Entropy (Basel). 2023 Aug 14;25(8):1205. doi: 10.3390/e25081205.
7. Divide-and-conquer the NAS puzzle in resource-constrained federated learning systems.
   Neural Netw. 2023 Nov;168:569-579. doi: 10.1016/j.neunet.2023.10.006. Epub 2023 Oct 7.
8. Contrastive encoder pre-training-based clustered federated learning for heterogeneous data.
   Neural Netw. 2023 Aug;165:689-704. doi: 10.1016/j.neunet.2023.06.010. Epub 2023 Jun 10.
9. A Cluster-Driven Adaptive Training Approach for Federated Learning.
   Sensors (Basel). 2022 Sep 18;22(18):7061. doi: 10.3390/s22187061.
10. Federated influencer learning for secure and efficient collaborative learning in realistic medical database environment.
    Sci Rep. 2024 Sep 30;14(1):22729. doi: 10.1038/s41598-024-73863-1.

Cited By

1. Revolutionizing healthcare data analytics with federated learning: A comprehensive survey of applications, systems, and future directions.
   Comput Struct Biotechnol J. 2025 Jun 11;28:217-238. doi: 10.1016/j.csbj.2025.06.009. eCollection 2025.
2. Application of Federated Learning in Cardiology: Key Challenges and Potential Solutions.
   Mayo Clin Proc Digit Health. 2024 Oct 11;2(4):590-595. doi: 10.1016/j.mcpdig.2024.09.005. eCollection 2024 Dec.
3. Edge-Cloud Synergy for AI-Enhanced Sensor Network Data: A Real-Time Predictive Maintenance Framework.
   Sensors (Basel). 2024 Dec 11;24(24):7918. doi: 10.3390/s24247918.
4. Efficient federated learning for distributed neuroimaging data.
   Front Neuroinform. 2024 Sep 9;18:1430987. doi: 10.3389/fninf.2024.1430987. eCollection 2024.
5. LF3PFL: A Practical Privacy-Preserving Federated Learning Algorithm Based on Local Federalization Scheme.
   Entropy (Basel). 2024 Apr 23;26(5):353. doi: 10.3390/e26050353.
6. A Communication-Efficient, Privacy-Preserving Federated Learning Algorithm Based on Two-Stage Gradient Pruning and Differentiated Differential Privacy.
   Sensors (Basel). 2023 Nov 21;23(23):9305. doi: 10.3390/s23239305.
7. Limitations and Future Aspects of Communication Costs in Federated Learning: A Survey.
   Sensors (Basel). 2023 Aug 23;23(17):7358. doi: 10.3390/s23177358.
8. Federated learning for 6G-enabled secure communication systems: a comprehensive survey.
   Artif Intell Rev. 2023 Mar 12:1-93. doi: 10.1007/s10462-023-10417-3.
9. Federated Learning for Medical Image Analysis with Deep Neural Networks.
   Diagnostics (Basel). 2023 Apr 24;13(9):1532. doi: 10.3390/diagnostics13091532.
10. Effective Model Update for Adaptive Classification of Text Streams in a Distributed Learning Environment.
    Sensors (Basel). 2022 Nov 29;22(23):9298. doi: 10.3390/s22239298.