

Divide-and-conquer the NAS puzzle in resource-constrained federated learning systems.

Authors

Yeshwanth Venkatesha, Youngeun Kim, Hyoungseob Park, Priyadarshini Panda

Affiliations

Department of Electrical Engineering, Yale University, New Haven, CT, USA.


Publication

Neural Netw. 2023 Nov;168:569-579. doi: 10.1016/j.neunet.2023.10.006. Epub 2023 Oct 7.

Abstract

Federated Learning (FL) is a privacy-preserving distributed machine learning approach geared towards applications on edge devices. However, the problem of designing custom neural architectures in federated environments has not been tackled from the perspective of overall system efficiency. In this paper, we propose DC-NAS, a divide-and-conquer approach that performs supernet-based Neural Architecture Search (NAS) in a federated system by systematically sampling the search space. We propose a novel diversified sampling strategy that balances exploration and exploitation of the search space by initially maximizing the distance between samples and progressively shrinking this distance as training progresses. We then perform channel pruning to further reduce the training complexity at the devices. We show that our approach outperforms several sampling strategies, including Hadamard sampling, where the samples are maximally separated. We evaluate our method on the CIFAR10, CIFAR100, EMNIST, and TinyImagenet benchmarks and present a comprehensive analysis of different aspects of federated learning, such as scalability and non-IID data. DC-NAS achieves near iso-accuracy compared to full-scale federated NAS with 50% fewer resources.
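The diversified sampling strategy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: sub-networks are represented as binary encodings, and a candidate is accepted only if its Hamming distance to every previously chosen sample exceeds a threshold that decays over training rounds (exploration early, exploitation late). All function names and the linear decay schedule are assumptions made for this sketch.

```python
import random

def hamming(a, b):
    """Hamming distance between two equal-length binary encodings."""
    return sum(x != y for x, y in zip(a, b))

def sample_subnets(num_samples, encoding_len, round_idx, total_rounds, rng):
    """Draw sub-network encodings whose pairwise Hamming distance is at
    least a threshold that shrinks linearly with the training round."""
    # Threshold decays from roughly half the encoding length down to 1.
    frac = 1.0 - round_idx / max(total_rounds - 1, 1)
    threshold = max(1, int(frac * encoding_len / 2))
    chosen = []
    while len(chosen) < num_samples:
        cand = [rng.randint(0, 1) for _ in range(encoding_len)]
        # Accept only candidates far enough from everything chosen so far.
        if all(hamming(cand, c) >= threshold for c in chosen):
            chosen.append(cand)
    return chosen, threshold

rng = random.Random(0)
# Early round: samples are forced far apart (exploration).
early, t_early = sample_subnets(4, 16, round_idx=0, total_rounds=10, rng=rng)
# Late round: the separation requirement has shrunk (exploitation).
late, t_late = sample_subnets(4, 16, round_idx=9, total_rounds=10, rng=rng)
```

Under this schedule the early-round threshold is half the encoding length and the final-round threshold is 1, so samples progressively concentrate around promising regions of the search space.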

