

Probabilistic Predictions with Federated Learning

Authors

Thorgeirsson Adam Thor, Gauterin Frank

Affiliations

Dr. Ing. h.c. F. Porsche AG, 71287 Weissach, Germany.

Institute of Vehicle System Technology, Karlsruhe Institute of Technology, 76131 Karlsruhe, Germany.

Publication

Entropy (Basel). 2020 Dec 30;23(1):41. doi: 10.3390/e23010041.

PMID: 33396677
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7823259/
Abstract

Probabilistic predictions with machine learning are important in many applications. These are commonly done with Bayesian learning algorithms. However, Bayesian learning methods are computationally expensive in comparison with non-Bayesian methods. Furthermore, the data used to train these algorithms are often distributed over a large group of end devices. Federated learning can be applied in this setting in a communication-efficient and privacy-preserving manner but does not include predictive uncertainty. To represent predictive uncertainty in federated learning, our suggestion is to introduce uncertainty in the aggregation step of the algorithm by treating the set of local weights as a posterior distribution for the weights of the global model. We compare our approach to state-of-the-art Bayesian and non-Bayesian probabilistic learning algorithms. By applying proper scoring rules to evaluate the predictive distributions, we show that our approach can achieve similar performance as the benchmark would achieve in a non-distributed setting.
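The aggregation idea described in the abstract — treating the set of locally trained weights as a posterior distribution over the global model's weights — can be illustrated with a toy sketch. This is an assumption-laden illustration (NumPy, a linear model, synthetic client weights, a moment-matched diagonal Gaussian posterior), not the authors' actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear model: y = x @ w, with K clients and d weights per client.
K, d = 10, 3
w_true = np.array([1.0, -2.0, 0.5])

# Stand-in for the clients' locally trained weights (noisy copies of w_true).
local_weights = np.stack([w_true + 0.1 * rng.standard_normal(d) for _ in range(K)])

# Standard FedAvg aggregation: a single point estimate, no uncertainty.
w_fedavg = local_weights.mean(axis=0)

# Probabilistic variant: treat the K local weight vectors as samples from a
# posterior over the global weights (here a moment-matched diagonal Gaussian),
# then propagate that spread into the prediction.
mu = local_weights.mean(axis=0)
sigma = local_weights.std(axis=0, ddof=1)

x = np.array([0.5, 1.0, -1.5])
weight_samples = rng.normal(mu, sigma, size=(1000, d))
preds = weight_samples @ x          # one prediction per posterior weight sample

mean_pred = preds.mean()            # predictive mean (matches FedAvg in expectation)
std_pred = preds.std()              # predictive uncertainty FedAvg cannot express
print(f"prediction: {mean_pred:.3f} +/- {std_pred:.3f}")
```

The predictive distribution obtained this way can then be scored with proper scoring rules (e.g. the continuous ranked probability score), which is how the paper compares against Bayesian and non-Bayesian baselines.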


Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/527b/7823259/8aacafcdd026/entropy-23-00041-g001.jpg
Figure 2: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/527b/7823259/53d9d801044d/entropy-23-00041-g002.jpg
Figure 3: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/527b/7823259/fa6ee09f0ba0/entropy-23-00041-g003.jpg

Similar Articles

1. Probabilistic Predictions with Federated Learning.
Entropy (Basel). 2020 Dec 30;23(1):41. doi: 10.3390/e23010041.

2. Ternary Compression for Communication-Efficient Federated Learning.
IEEE Trans Neural Netw Learn Syst. 2022 Mar;33(3):1162-1176. doi: 10.1109/TNNLS.2020.3041185. Epub 2022 Feb 28.

3. Multi-site fMRI analysis using privacy-preserving federated learning and domain adaptation: ABIDE results.
Med Image Anal. 2020 Oct;65:101765. doi: 10.1016/j.media.2020.101765. Epub 2020 Jul 2.

4. Aggregation Strategy on Federated Machine Learning Algorithm for Collaborative Predictive Maintenance.
Sensors (Basel). 2022 Aug 19;22(16):6252. doi: 10.3390/s22166252.

5. Privacy-Preserving Federated Survival Support Vector Machines for Cross-Institutional Time-To-Event Analysis: Algorithm Development and Validation.
JMIR AI. 2024 Mar 29;3:e47652. doi: 10.2196/47652.

6. FedPSO: Federated Learning Using Particle Swarm Optimization to Reduce Communication Costs.
Sensors (Basel). 2021 Jan 16;21(2):600. doi: 10.3390/s21020600.

7. An EMD-Based Adaptive Client Selection Algorithm for Federated Learning in Heterogeneous Data Scenarios.
Front Plant Sci. 2022 Jun 9;13:908814. doi: 10.3389/fpls.2022.908814. eCollection 2022.

8. DWFed: A statistical-heterogeneity-based dynamic weighted model aggregation algorithm for federated learning.
Front Neurorobot. 2022 Nov 24;16:1041553. doi: 10.3389/fnbot.2022.1041553. eCollection 2022.

9. Communication-Efficient Federated Deep Learning With Layerwise Asynchronous Model Update and Temporally Weighted Aggregation.
IEEE Trans Neural Netw Learn Syst. 2020 Oct;31(10):4229-4238. doi: 10.1109/TNNLS.2019.2953131. Epub 2019 Dec 30.

10. Federated learning of predictive models from federated Electronic Health Records.
Int J Med Inform. 2018 Apr;112:59-67. doi: 10.1016/j.ijmedinf.2018.01.007. Epub 2018 Jan 12.

Cited By

1. Neural Network Used for the Fusion of Predictions Obtained by the K-Nearest Neighbors Algorithm Based on Independent Data Sources.
Entropy (Basel). 2021 Nov 25;23(12):1568. doi: 10.3390/e23121568.

2. Toward Learning Trustworthily from Data Combining Privacy, Fairness, and Explainability: An Application to Face Recognition.
Entropy (Basel). 2021 Aug 14;23(8):1047. doi: 10.3390/e23081047.

References

1. Averaging Is Probably Not the Optimum Way of Aggregating Parameters in Federated Learning.
Entropy (Basel). 2020 Mar 11;22(3):314. doi: 10.3390/e22030314.