

Co-Learning Bayesian Optimization

Authors

Guo Zhendong, Ong Yew-Soon, He Tiantian, Liu Haitao

Publication

IEEE Trans Cybern. 2022 Sep;52(9):9820-9833. doi: 10.1109/TCYB.2022.3168551. Epub 2022 Aug 18.

DOI: 10.1109/TCYB.2022.3168551
PMID: 35687641
Abstract

Bayesian optimization (BO) is well known to be sample efficient for solving black-box problems. However, BO algorithms may get stuck in suboptimal solutions even with plenty of samples. Intrinsically, this suboptimality can be attributed to the poor surrogate accuracy of the trained Gaussian process (GP), particularly in the regions where the optimal solutions are located. Hence, we propose to build multiple GP models, instead of a single GP surrogate, that complement each other, thus resolving the suboptimality of BO. Nevertheless, according to the bias-variance tradeoff, the individual prediction errors can increase as model diversity increases, which may lead to even worse overall surrogate accuracy. On the other hand, based on the theory of Rademacher complexity, it has been proven that exploiting the agreement of models on unlabeled information can reduce the complexity of the hypothesis space, thereby achieving the required surrogate accuracy with fewer samples. The value of such model agreement has been demonstrated extensively in co-training-style algorithms, which boost model accuracy with a small portion of labeled samples. Inspired by the above, we propose a novel BO algorithm, termed co-learning BO (CLBO), that exploits both model diversity and agreement on unlabeled information to improve the overall surrogate accuracy with limited samples, thereby achieving more efficient global optimization. Tests on five numerical toy problems and three engineering benchmarks demonstrate the effectiveness of the proposed CLBO.
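The multi-surrogate idea described in the abstract can be sketched with a toy loop: two GPs with different kernels provide model diversity, and a disagreement term on unlabeled candidate points steers sampling toward regions where the models agree. This is only an illustrative sketch of the general idea (using scikit-learn GPs and a hand-rolled lower-confidence-bound acquisition), not the authors' actual CLBO algorithm; the objective, kernels, and weights are assumptions chosen for the demo.

```python
# Toy ensemble-of-GPs Bayesian optimization loop (illustration only, NOT CLBO).
# Diversity: two GPs with different kernels. Agreement: a penalty on the
# standard deviation of the models' predictions over unlabeled candidates.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern

rng = np.random.default_rng(0)

def f(x):
    """1-D toy black-box objective to minimize (assumed for the demo)."""
    return np.sin(3 * x) + 0.5 * x ** 2

# Initial design of 5 random samples on [-2, 2]
X = rng.uniform(-2, 2, size=(5, 1))
y = f(X).ravel()

# Model diversity: two surrogates with different kernel families
models = [
    GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                             alpha=1e-6, normalize_y=True),
    GaussianProcessRegressor(kernel=Matern(nu=1.5),
                             alpha=1e-6, normalize_y=True),
]

# Unlabeled candidate pool (a dense grid for this 1-D toy problem)
candidates = np.linspace(-2, 2, 200).reshape(-1, 1)

for _ in range(15):
    mus, sds = [], []
    for m in models:
        m.fit(X, y)
        mu, sd = m.predict(candidates, return_std=True)
        mus.append(mu)
        sds.append(sd)
    mu_bar = np.mean(mus, axis=0)          # ensemble mean prediction
    sd_bar = np.mean(sds, axis=0)          # ensemble mean uncertainty
    disagreement = np.std(mus, axis=0)     # where the two models conflict
    # Lower-confidence-bound acquisition, penalizing model disagreement
    acq = mu_bar - 2.0 * sd_bar + 1.0 * disagreement
    x_next = candidates[np.argmin(acq)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next).ravel())

print("best f found:", y.min())
```

With 5 initial points plus 15 BO iterations, the loop reliably locates the negative basin of this toy objective; the disagreement weight (here 1.0) trades off exploiting regions the models agree on against exploring contested ones.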


Similar articles

1. Co-Learning Bayesian Optimization. IEEE Trans Cybern. 2022 Sep;52(9):9820-9833. doi: 10.1109/TCYB.2022.3168551. Epub 2022 Aug 18.
2. Funneled Bayesian Optimization for Design, Tuning and Control of Autonomous Systems. IEEE Trans Cybern. 2019 Apr;49(4):1489-1500. doi: 10.1109/TCYB.2018.2805695. Epub 2018 Feb 27.
3. Bayesian Optimization Based on K-Optimality. Entropy (Basel). 2018 Aug 9;20(8):594. doi: 10.3390/e20080594.
4. Surrogate Modeling for Bayesian Optimization Beyond a Single Gaussian Process. IEEE Trans Pattern Anal Mach Intell. 2023 Sep;45(9):11283-11296. doi: 10.1109/TPAMI.2023.3264741. Epub 2023 Aug 7.
5. Bayesian Optimization for Efficient Prediction of Gas Uptake in Nanoporous Materials. Chemphyschem. 2024 Aug 19;25(16):e202300850. doi: 10.1002/cphc.202300850. Epub 2024 Jul 24.
6. Solving Expensive Optimization Problems in Dynamic Environments With Meta-Learning. IEEE Trans Cybern. 2024 Dec;54(12):7430-7442. doi: 10.1109/TCYB.2024.3443396. Epub 2024 Nov 27.
7. Stochastic machine learning via sigma profiles to build a digital chemical space. Proc Natl Acad Sci U S A. 2024 Jul 30;121(31):e2404676121. doi: 10.1073/pnas.2404676121. Epub 2024 Jul 23.
8. Global optimization ensemble model for classification methods. ScientificWorldJournal. 2014;2014:313164. doi: 10.1155/2014/313164. Epub 2014 Apr 27.
9. Geoacoustic inversion using Bayesian optimization with a Gaussian process surrogate model. J Acoust Soc Am. 2024 Aug 1;156(2):812-822. doi: 10.1121/10.0028177.
10. Generative Multiform Bayesian Optimization. IEEE Trans Cybern. 2023 Jul;53(7):4347-4360. doi: 10.1109/TCYB.2022.3165044. Epub 2023 Jun 15.

Cited by

1. Interpretable Machine Learning for Predicting Neoadjuvant Chemotherapy Response in Breast Cancer Using the Baseline Clinical and Pathological Characteristics. Cancer Med. 2025 Sep;14(17):e71221. doi: 10.1002/cam4.71221.