Multi-Objective Evolutionary Federated Learning.

Publication Information

IEEE Trans Neural Netw Learn Syst. 2020 Apr;31(4):1310-1322. doi: 10.1109/TNNLS.2019.2919699. Epub 2019 Jun 24.

Abstract

Federated learning is an emerging technique for preventing the leakage of private information. Unlike centralized learning, which collects users' data and stores it on a cloud server, federated learning makes it possible to learn a global model while the data remain distributed on the users' devices. However, compared with the traditional centralized approach, the federated setting consumes considerable communication resources on the clients; this communication is indispensable for updating the global model, yet its cost prevents the technique from being widely used. In this paper, we optimize the structure of the neural network models in federated learning using a multi-objective evolutionary algorithm that simultaneously minimizes the communication cost and the global model test error. A scalable method for encoding network connectivity is adapted to federated learning to improve the efficiency of evolving deep neural networks. Experimental results on both multilayer perceptrons and convolutional neural networks indicate that the proposed method finds neural network models that not only significantly reduce communication costs but also improve the learning performance of federated learning compared with standard fully connected neural networks.
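
To make the described trade-off concrete, below is a minimal Python sketch of such a bi-objective evolutionary loop. It is an illustration, not the authors' implementation: the binary connectivity-mask encoding, the plain Pareto-dominance selection (the paper's environmental selection uses additional criteria such as crowding), and the `federated_train_and_eval` stub are all simplifying assumptions introduced here.

```python
import random

# Hypothetical sketch of the bi-objective evolutionary loop described in the
# abstract. Each individual encodes which connections of a fixed network
# topology are active; we minimize (communication cost, global test error).

def random_mask(n_connections, p_active=0.5):
    # Sample a random binary mask: True means the connection is kept.
    return [random.random() < p_active for _ in range(n_connections)]

def communication_cost(mask):
    # Objective 1 (proxy): number of parameters each client uploads per round.
    return sum(mask)

def federated_train_and_eval(mask):
    # Objective 2 (stub): a real implementation would build the masked model,
    # run federated averaging across clients, and return the global test error.
    return random.random()

def dominates(f_a, f_b):
    # Pareto dominance for minimizing both objectives.
    return all(a <= b for a, b in zip(f_a, f_b)) and \
           any(a < b for a, b in zip(f_a, f_b))

def mutate(mask, flip_rate=0.02):
    # Bit-flip mutation on the connectivity encoding.
    return [bit != (random.random() < flip_rate) for bit in mask]

def evolve(n_connections=256, pop_size=20, generations=10):
    population = [random_mask(n_connections) for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(ind) for ind in population]
        scored = [(ind, (communication_cost(ind), federated_train_and_eval(ind)))
                  for ind in population + offspring]
        # Environmental selection: keep non-dominated solutions first, then
        # fill the remaining slots with dominated ones.
        front = [s for s in scored
                 if not any(dominates(f, s[1]) for _, f in scored if f is not s[1])]
        rest = [s for s in scored if s not in front]
        population = [ind for ind, _ in (front + rest)[:pop_size]]
    return population  # approximates the cost/error trade-off front
```

In the paper, the second objective is obtained by actually training the global model with federated averaging on client data; the stub above only marks where that evaluation would plug in, and in practice it dominates the runtime of the whole loop.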

