
A Graph Neural Network Based Decentralized Learning Scheme.

Affiliations

College of Information Science and Electronic Engineering, Zhejiang University, Hangzhou 310027, China.

Zhejiang Provincial Key Laboratory of Information Processing, Communication and Networking (IPCAN), Hangzhou 310027, China.

Publication information

Sensors (Basel). 2022 Jan 28;22(3):1030. doi: 10.3390/s22031030.

Abstract

As an emerging paradigm considering data privacy and transmission efficiency, decentralized learning aims to acquire a global model using the training data distributed over many user devices. It is a challenging problem, since link loss, partial device participation, and non-independent and identically distributed (non-iid) data would all deteriorate the performance of decentralized learning algorithms. Existing work may be restricted to linear models or show poor performance over non-iid data. Therefore, in this paper, we propose a decentralized learning scheme based on distributed parallel stochastic gradient descent (DPSGD) and graph neural networks (GNN) to deal with the above challenges. Specifically, each user device participating in the learning task utilizes local training data to compute local stochastic gradients and updates its own local model. Then, each device utilizes the GNN model and exchanges the model parameters with its neighbors to reach the average of resultant global models. The iteration repeats until the algorithm converges. Extensive simulation results over both iid and non-iid data validate the algorithm's convergence to near optimal results and robustness to both link loss and partial device participation.
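The iteration the abstract describes (a local SGD step on each device, followed by parameter exchange and averaging with neighbors) can be sketched on a toy least-squares problem. This is a minimal illustration only: the paper's learned GNN aggregation is replaced here by a fixed doubly-stochastic mixing matrix over a ring topology, and all sizes, learning rates, and the data model are assumptions, not the authors' settings.

```python
import numpy as np

# Toy sketch of the DPSGD-style loop from the abstract: each device takes
# a local stochastic gradient step, then averages parameters with its
# neighbors. A fixed ring mixing matrix stands in for the paper's GNN
# aggregation (an assumption for illustration).

rng = np.random.default_rng(0)
n_devices, dim, n_iters, lr = 8, 5, 300, 0.05

# Shared ground-truth model; each device holds its own local data with a
# shifted feature distribution, mimicking non-iid splits.
w_true = rng.normal(size=dim)
X = [rng.normal(loc=0.1 * i, size=(20, dim)) for i in range(n_devices)]
y = [Xi @ w_true + 0.01 * rng.normal(size=20) for Xi in X]

# Ring topology: each device averages itself with its two neighbors.
W = np.zeros((n_devices, n_devices))
for i in range(n_devices):
    W[i, i] = W[i, (i - 1) % n_devices] = W[i, (i + 1) % n_devices] = 1 / 3

models = rng.normal(size=(n_devices, dim))
for _ in range(n_iters):
    # 1) Local stochastic gradient step on each device.
    for i in range(n_devices):
        j = rng.integers(len(y[i]))                      # sample one local point
        grad = (X[i][j] @ models[i] - y[i][j]) * X[i][j]  # least-squares gradient
        models[i] -= lr * grad
    # 2) Exchange parameters with neighbors and average (gossip step).
    models = W @ models

# After enough rounds, all devices agree on a model close to w_true.
print(np.max(np.abs(models - w_true)))
```

The gossip step `models = W @ models` is what a learned GNN aggregation would replace in the paper's scheme: instead of fixed uniform neighbor weights, the GNN decides how each device combines its neighbors' parameters.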


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/01f4/8839979/1c9f641e92ac/sensors-22-01030-g001.jpg

Similar articles

1
A Graph Neural Network Based Decentralized Learning Scheme.
Sensors (Basel). 2022 Jan 28;22(3):1030. doi: 10.3390/s22031030.
2
A federated graph neural network framework for privacy-preserving personalization.
Nat Commun. 2022 Jun 2;13(1):3091. doi: 10.1038/s41467-022-30714-9.
3
Subgraph-level federated graph neural network for privacy-preserving recommendation with meta-learning.
Neural Netw. 2024 Nov;179:106574. doi: 10.1016/j.neunet.2024.106574. Epub 2024 Jul 25.
4
Personalized On-Device E-Health Analytics With Decentralized Block Coordinate Descent.
IEEE J Biomed Health Inform. 2022 Jun;26(6):2778-2786. doi: 10.1109/JBHI.2022.3140455. Epub 2022 Jun 3.
5
Low precision decentralized distributed training over IID and non-IID data.
Neural Netw. 2022 Nov;155:451-460. doi: 10.1016/j.neunet.2022.08.032. Epub 2022 Sep 6.
6
A(DP)²SGD: Asynchronous Decentralized Parallel Stochastic Gradient Descent With Differential Privacy.
IEEE Trans Pattern Anal Mach Intell. 2022 Nov;44(11):8036-8047. doi: 10.1109/TPAMI.2021.3107796. Epub 2022 Oct 4.
7
Federated learning using model projection for multi-center disease diagnosis with non-IID data.
Neural Netw. 2024 Oct;178:106409. doi: 10.1016/j.neunet.2024.106409. Epub 2024 May 24.
10
Robust and Privacy-Preserving Decentralized Deep Federated Learning Training: Focusing on Digital Healthcare Applications.
IEEE/ACM Trans Comput Biol Bioinform. 2024 Jul-Aug;21(4):890-901. doi: 10.1109/TCBB.2023.3243932. Epub 2024 Aug 8.
