

Heuristic based federated learning with adaptive hyperparameter tuning for households energy prediction.

Authors

Toderean Liana, Daian Mihai, Cioara Tudor, Anghel Ionut, Michalakopoulos Vasilis, Sarantinopoulos Efstathios, Sarmas Elissaios

Affiliations

Distributed Systems Research Laboratory, Computer Science Department, Technical University of Cluj-Napoca, G. Barițiu 26-28, Cluj-Napoca, 400027, Romania.

Decision Support Systems Laboratory, School of Electrical & Computer Engineering, National Technical University of Athens, Ir. Politechniou 9, Athens, 157 73, Greece.

Publication Info

Sci Rep. 2025 Apr 12;15(1):12564. doi: 10.1038/s41598-025-96443-3.

DOI: 10.1038/s41598-025-96443-3
PMID: 40221586
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC11993608/
Abstract

Federated Learning is transforming electrical load forecasting by enabling Artificial Intelligence (AI) models to be trained directly on household edge devices. However, the prediction accuracy of federated learning models tends to diminish when dealing with non-IID data, highlighting the need for adaptive hyperparameter optimization strategies to improve performance. In this paper, we propose a novel hierarchical federated learning solution for efficient model aggregation and hyperparameter tuning, specifically tailored to household energy prediction. Households with similar energy profiles are clustered at the edge, linked, and aggregated at the fog level, to enable effective and adaptive hyperparameter tuning. The federated model aggregation is optimized using hierarchical simulated annealing optimization to prioritize updates from the better-performing models. A genetic algorithm-based hyperparameter optimization method reduces the computational load on edge nodes by efficiently exploring different configurations and using only the most promising ones for edge nodes' cross-validation. The evaluation results demonstrate a significant improvement in average prediction accuracy and better capture of energy patterns compared to the federated averaging approach. The impact on network traffic among nodes across different layers is kept below 30 KB. Additionally, hyperparameter tuning reduces the size of model updates and the number of communication rounds by 30%, which is particularly beneficial when network resources are limited.
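The abstract's core aggregation idea is to weight client updates by how well each client model performs, rather than averaging them uniformly as in federated averaging. As a rough illustration only, the sketch below uses a softmax over negative validation losses to stand in for the paper's hierarchical simulated-annealing prioritization; the function name, weighting scheme, and flat-vector model representation are all assumptions, not the authors' implementation:

```python
import math

def performance_weighted_aggregate(client_models, client_losses, temperature=1.0):
    """Average flat parameter vectors from clients, giving more weight
    to clients with lower validation loss.

    Illustrative sketch: the paper prioritizes better-performing
    models via hierarchical simulated annealing; here a softmax over
    negative losses approximates that prioritization.
    """
    # Softmax over negative losses: lower loss -> larger weight.
    scores = [math.exp(-loss / temperature) for loss in client_losses]
    total = sum(scores)
    weights = [s / total for s in scores]

    # Weighted element-wise average of the client parameter vectors.
    n_params = len(client_models[0])
    return [
        sum(w * model[i] for w, model in zip(weights, client_models))
        for i in range(n_params)
    ]

# With equal losses this reduces to plain federated averaging.
models = [[1.0, 2.0], [3.0, 4.0]]
print(performance_weighted_aggregate(models, [0.5, 0.5]))  # [2.0, 3.0]
```

When one client's loss is much larger, its weight decays exponentially, so poorly fitting households contribute little to the global model; the `temperature` knob controls how sharply aggregation concentrates on the best performers.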


https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/d453e372be5a/41598_2025_96443_Fig12_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/1a3a72909e32/41598_2025_96443_Fig1_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/ed5f94a32c30/41598_2025_96443_Fig2_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/ce502f782780/41598_2025_96443_Figa_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/b59f344e7eb4/41598_2025_96443_Figb_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/060738e7f818/41598_2025_96443_Fig3_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/c8ee48a77965/41598_2025_96443_Fig4_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/f0e710f9c489/41598_2025_96443_Fig5_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/f4634792f518/41598_2025_96443_Fig6_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/c249fb218366/41598_2025_96443_Fig7_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/b51071817c57/41598_2025_96443_Fig8_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/5ed0e375180b/41598_2025_96443_Fig9_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/8e35ad3a1ba0/41598_2025_96443_Fig10_HTML.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/2aee/11993608/ef276bd19ecd/41598_2025_96443_Fig11_HTML.jpg

Similar Articles

1. Heuristic based federated learning with adaptive hyperparameter tuning for households energy prediction.
Sci Rep. 2025 Apr 12;15(1):12564. doi: 10.1038/s41598-025-96443-3.

2. Genetic CFL: Hyperparameter Optimization in Clustered Federated Learning.
Comput Intell Neurosci. 2021 Nov 18;2021:7156420. doi: 10.1155/2021/7156420. eCollection 2021.

3. Federated Reinforcement Learning-Based Dynamic Resource Allocation and Task Scheduling in Edge for IoT Applications.
Sensors (Basel). 2025 Mar 30;25(7):2197. doi: 10.3390/s25072197.

4. Machine Learning-Based Boosted Regression Ensemble Combined with Hyperparameter Tuning for Optimal Adaptive Learning.
Sensors (Basel). 2022 May 16;22(10):3776. doi: 10.3390/s22103776.

5. Adaptive federated learning for resource-constrained IoT devices through edge intelligence and multi-edge clustering.
Sci Rep. 2024 Nov 20;14(1):28746. doi: 10.1038/s41598-024-78239-z.

6. FedDyH: A Multi-Policy with GA Optimization Framework for Dynamic Heterogeneous Federated Learning.
Biomimetics (Basel). 2025 Mar 17;10(3):185. doi: 10.3390/biomimetics10030185.

7. FedPSO: Federated Learning Using Particle Swarm Optimization to Reduce Communication Costs.
Sensors (Basel). 2021 Jan 16;21(2):600. doi: 10.3390/s21020600.

8. Heuristic hyperparameter optimization of deep learning models for genomic prediction.
G3 (Bethesda). 2021 Jul 14;11(7). doi: 10.1093/g3journal/jkab032.

9. Communication-Efficient Hybrid Federated Learning for E-Health With Horizontal and Vertical Data Partitioning.
IEEE Trans Neural Netw Learn Syst. 2025 Mar;36(3):5614-5628. doi: 10.1109/TNNLS.2024.3383748. Epub 2025 Feb 28.

10. Intelligent deep federated learning model for enhancing security in internet of things enabled edge computing environment.
Sci Rep. 2025 Feb 3;15(1):4041. doi: 10.1038/s41598-025-88163-5.
