
Federated learning improves site performance in multicenter deep learning without data sharing.

Affiliations

Department of Radiological Sciences, University of California, Los Angeles, Los Angeles, California, USA.

Department of Bioengineering, University of California, Los Angeles, Los Angeles, California, USA.

Publication Information

J Am Med Inform Assoc. 2021 Jun 12;28(6):1259-1264. doi: 10.1093/jamia/ocaa341.

Abstract

OBJECTIVE

To demonstrate that federated learning (FL) enables multi-institutional training without centralizing or sharing the underlying physical data.

MATERIALS AND METHODS

Deep learning models were trained at each participating institution using local clinical data, and an additional model was trained using FL across all of the institutions.
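The abstract does not include training code, but the procedure it describes follows the standard federated averaging (FedAvg) pattern: each site optimizes a model on data that never leaves the institution, and a central server aggregates only the resulting weights. The sketch below, in NumPy with a toy logistic-regression model, is a minimal illustration under assumed data shapes and hyperparameters; it is not the authors' actual deep learning setup.

```python
# Minimal FedAvg sketch. The three "sites" below stand in for participating
# institutions; the logistic-regression model, synthetic data, and
# hyperparameters are illustrative assumptions, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

def local_update(weights, X, y, lr=0.1, epochs=5):
    """A few epochs of gradient descent on one institution's private data."""
    w = weights.copy()
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-X @ w))    # sigmoid predictions
        w -= lr * (X.T @ (p - y)) / len(y)  # logistic-loss gradient step
    return w

# Each institution holds local data that is never transferred or pooled.
sites = [(rng.normal(size=(200, 10)), rng.integers(0, 2, 200).astype(float))
         for _ in range(3)]

global_w = np.zeros(10)
for _ in range(20):  # communication rounds
    # Sites train locally; only model weights travel to the server.
    local_ws = [local_update(global_w, X, y) for X, y in sites]
    # The server aggregates by averaging (sites here are equal-sized).
    global_w = np.mean(local_ws, axis=0)

print("federated model weights:", np.round(global_w, 3))
```

Only the weight vectors cross institutional boundaries in each round, which is what lets the scheme sidestep the data-transfer and pooling risks discussed below.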

RESULTS

We found that the FL model exhibited superior performance and generalizability compared with the models trained at single institutions: when evaluated on held-out test sets from each institution and on an outside challenge dataset, its overall performance was significantly better than that of any institutional model alone.

DISCUSSION

The power of FL was successfully demonstrated across 3 academic institutions while avoiding the privacy risk associated with the transfer and pooling of patient data.

CONCLUSION

Federated learning is an effective methodology that merits further study as a way to accelerate model development across institutions and improve generalizability in clinical use.

Figure 1: https://cdn.ncbi.nlm.nih.gov/pmc/blobs/ca9e/8200268/ff89a90d546c/ocaa341f1.jpg
