


Bayesian Optimization Based on K-Optimality.

Authors

Yan Liang, Duan Xiaojun, Liu Bowen, Xu Jin

Affiliation

College of Liberal Arts and Sciences, National University of Defense Technology, Changsha 410000, China.

Publication

Entropy (Basel). 2018 Aug 9;20(8):594. doi: 10.3390/e20080594.

DOI: 10.3390/e20080594
PMID: 33265683
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7513107/
Abstract

Bayesian optimization (BO) based on the Gaussian process (GP) surrogate model has attracted extensive attention in the field of optimization and design of experiments (DoE). It usually faces two problems: the unstable GP prediction due to the ill-conditioned Gram matrix of the kernel and the difficulty of determining the trade-off parameter between exploitation and exploration. To solve these problems, we investigate the K-optimality, aiming at minimizing the condition number. Firstly, the Sequentially Bayesian K-optimal design (SBKO) is proposed to ensure the stability of the GP prediction, where the K-optimality is given as the acquisition function. We show that the SBKO reduces the integrated posterior variance and maximizes the hyper-parameters' information gain simultaneously. Secondly, a K-optimal enhanced Bayesian Optimization (KO-BO) approach is given for the optimization problems, where the K-optimality is used to define the trade-off balance parameters which can be output automatically. Specifically, we focus our study on the K-optimal enhanced Expected Improvement algorithm (KO-EI). Numerical examples show that the SBKO generally outperforms the Monte Carlo, Latin hypercube sampling, and sequential DoE approaches by maximizing the posterior variance with the highest precision of prediction. Furthermore, the study of the optimization problem shows that the KO-EI method beats the classical EI method due to its higher convergence rate and smaller variance.
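The core idea behind SBKO — score each candidate design point by the condition number of the Gram matrix it would produce, and pick the point that keeps that matrix best conditioned — can be sketched as follows. This is a minimal illustration assuming an RBF kernel and a random candidate pool; function names are illustrative, not the paper's implementation.

```python
import numpy as np

def rbf_gram(X, length_scale=1.0):
    """RBF Gram matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 l^2))."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

def k_optimal_acquisition(X_obs, candidates, length_scale=1.0):
    """Score each candidate by the condition number of the Gram matrix
    augmented with that candidate; the K-optimal choice minimizes it."""
    scores = []
    for x in candidates:
        X_aug = np.vstack([X_obs, x[None, :]])
        scores.append(np.linalg.cond(rbf_gram(X_aug, length_scale)))
    return np.array(scores)

rng = np.random.default_rng(0)
X_obs = rng.uniform(0.0, 1.0, size=(5, 2))       # points already observed
candidates = rng.uniform(0.0, 1.0, size=(50, 2))  # candidate pool
scores = k_optimal_acquisition(X_obs, candidates)
x_next = candidates[np.argmin(scores)]            # next design point under K-optimality
```

Because the condition number of a symmetric positive-definite matrix is the ratio of its extreme eigenvalues, candidates too close to existing points (near-duplicate Gram rows) are penalized, which is what stabilizes the GP prediction.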

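For context on the trade-off parameter that KO-EI automates: classical Expected Improvement (for minimization) is EI(x) = (f* − μ − ξ)Φ(z) + σφ(z) with z = (f* − μ − ξ)/σ, where ξ is the hand-tuned exploration offset. A minimal sketch, with illustrative names:

```python
import math

def expected_improvement(mu, sigma, f_best, xi=0.01):
    """Classical EI for minimization. xi is the hand-tuned exploration
    trade-off that the paper's KO-EI derives from K-optimality instead."""
    if sigma <= 0.0:
        return 0.0  # no predictive uncertainty: no expected improvement
    z = (f_best - mu - xi) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))          # Phi(z)
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)   # phi(z)
    return (f_best - mu - xi) * cdf + sigma * pdf

# Larger predictive variance at an equal mean gives larger EI (exploration).
ei_low = expected_improvement(mu=0.0, sigma=0.1, f_best=0.0)
ei_high = expected_improvement(mu=0.0, sigma=1.0, f_best=0.0)
```

The sensitivity of EI's ranking to ξ is what motivates replacing the manual choice with an automatically computed balance parameter.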

Figures (PMC):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/54e8/7513107/c92e92650ed2/entropy-20-00594-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/54e8/7513107/1ac8de57342d/entropy-20-00594-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/54e8/7513107/fe90911300ad/entropy-20-00594-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/54e8/7513107/28acdd25d14e/entropy-20-00594-g004.jpg

Similar Articles

1
Bayesian Optimization Based on K-Optimality.
Entropy (Basel). 2018 Aug 9;20(8):594. doi: 10.3390/e20080594.
2
Surrogate Modeling for Bayesian Optimization Beyond a Single Gaussian Process.
IEEE Trans Pattern Anal Mach Intell. 2023 Sep;45(9):11283-11296. doi: 10.1109/TPAMI.2023.3264741. Epub 2023 Aug 7.
3
Funneled Bayesian Optimization for Design, Tuning and Control of Autonomous Systems.
IEEE Trans Cybern. 2019 Apr;49(4):1489-1500. doi: 10.1109/TCYB.2018.2805695. Epub 2018 Feb 27.
4
Bayesian optimization with Gaussian process surrogate model for source localization.
J Acoust Soc Am. 2023 Sep 1;154(3):1459-1470. doi: 10.1121/10.0020839.
5
Gaussian Process Based Expected Information Gain Computation for Bayesian Optimal Design.
Entropy (Basel). 2020 Feb 24;22(2):258. doi: 10.3390/e22020258.
6
ADMMBO: Bayesian Optimization with Unknown Constraints using ADMM.
J Mach Learn Res. 2019;20.
7
Geoacoustic inversion using Bayesian optimization with a Gaussian process surrogate model.
J Acoust Soc Am. 2024 Aug 1;156(2):812-822. doi: 10.1121/10.0028177.
8
Bayesian Optimization for Efficient Prediction of Gas Uptake in Nanoporous Materials.
Chemphyschem. 2024 Aug 19;25(16):e202300850. doi: 10.1002/cphc.202300850. Epub 2024 Jul 24.
9
Co-Learning Bayesian Optimization.
IEEE Trans Cybern. 2022 Sep;52(9):9820-9833. doi: 10.1109/TCYB.2022.3168551. Epub 2022 Aug 18.
10
Exploratory-Phase-Free Estimation of GP Hyperparameters in Sequential Design Methods - At the Example of Bayesian Inverse Problems.
Front Artif Intell. 2020 Aug 13;3:52. doi: 10.3389/frai.2020.00052. eCollection 2020.

Cited By

1
Generalized Nonlinear Least Squares Method for the Calibration of Complex Computer Code Using a Gaussian Process Surrogate.
Entropy (Basel). 2020 Sep 4;22(9):985. doi: 10.3390/e22090985.

References

1
A hierarchical adaptive approach to optimal experimental design.
Neural Comput. 2014 Nov;26(11):2465-92. doi: 10.1162/NECO_a_00654. Epub 2014 Aug 22.
2
Computer simulation of liquid crystals.
J Comput Aided Mol Des. 1989 Dec;3(4):335-53. doi: 10.1007/BF01532020.