
Differentially private knowledge transfer for federated learning.

Affiliations

Department of Electronic Engineering, Tsinghua University, 100084, Beijing, China.

Microsoft Research Asia, 100080, Beijing, China.

Publication information

Nat Commun. 2023 Jun 24;14(1):3785. doi: 10.1038/s41467-023-38794-x.

Abstract

Extracting useful knowledge from big data is important for machine learning. When data is privacy-sensitive and cannot be directly collected, federated learning is a promising option that extracts knowledge from decentralized data by learning and exchanging model parameters rather than raw data. However, model parameters may encode not only non-private knowledge but also private information of local data, so transferring knowledge via model parameters is not privacy-secure. Here, we present a knowledge transfer method named PrivateKT, which uses actively selected small public data to transfer high-quality knowledge in federated learning with privacy guarantees. We verify PrivateKT on three different datasets, and results show that PrivateKT can reduce the performance gap between centralized learning and existing federated learning methods by up to 84% under strict differential privacy restrictions. PrivateKT provides a potential direction toward effective and privacy-preserving knowledge transfer in machine intelligence systems.
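The mechanism the abstract describes, clients sharing privatized knowledge on shared public samples instead of model parameters, can be illustrated with a minimal sketch. This is not the paper's exact PrivateKT algorithm: the function names, the use of k-ary randomized response for the local privacy step, and majority-vote aggregation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def privatize_label(y, epsilon, k):
    """k-ary randomized response: report the true label y with
    probability e^eps / (e^eps + k - 1), otherwise a uniformly chosen
    other label. Satisfies epsilon-local differential privacy."""
    p_true = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    if rng.random() < p_true:
        return y
    other = rng.integers(0, k - 1)  # uniform over the k-1 other labels
    return other if other < y else other + 1

# Five clients each predict labels for the same 20 public samples
# (synthetic stand-ins here); only privatized labels leave a client,
# and the server aggregates them by majority vote.
k = 10
public_labels = rng.integers(0, k, size=20)   # stand-in for client predictions
noisy = np.array([[privatize_label(int(y), epsilon=2.0, k=k) for y in public_labels]
                  for _ in range(5)])
aggregated = np.array([np.bincount(col, minlength=k).argmax() for col in noisy.T])
```

Because only labels on public samples are transmitted, the privacy cost is bounded per sample by the randomized-response budget, while aggregation across clients recovers much of the signal lost to noise.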


Fig. 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/22e4/10290720/ae6323767e20/41467_2023_38794_Fig1_HTML.jpg
