

Metafeature Selection via Multivariate Sparse-Group Lasso Learning for Automatic Hyperparameter Configuration Recommendation

Author Information

Deng Liping, Chen Wen-Sheng, Xiao Mingqing

Publication Information

IEEE Trans Neural Netw Learn Syst. 2024 Sep;35(9):12540-12552. doi: 10.1109/TNNLS.2023.3263506. Epub 2024 Sep 3.

DOI: 10.1109/TNNLS.2023.3263506
PMID: 37037247
Abstract

The performance of classification algorithms is mainly governed by the hyperparameter settings deployed in applications, and the search for desirable hyperparameter configurations usually is quite challenging due to the complexity of datasets. Metafeatures are a group of measures that characterize the underlying dataset from various aspects, and the corresponding recommendation algorithm fully relies on the appropriate selection of metafeatures. Metalearning (MtL), aiming to improve the learning algorithm itself, requires development in integrating features, models, and algorithm learning to accomplish its goal. In this article, we develop a multivariate sparse-group Lasso (SGLasso) model embedded with MtL capacity in recommending suitable configurations via learning. The main idea is to select the principal metafeatures by removing those redundant or irregular ones, promoting both efficiency and performance in the hyperparameter configuration recommendation. To be specific, we first extract the metafeatures and classification performance of a set of configurations from the collection of historical datasets, and then, a metaregression task is established through SGLasso to capture the main characteristics of the underlying relationship between metafeatures and historical performance. For a new dataset, the classification performance of configurations can be estimated through the selected metafeatures so that the configuration with the highest predictive performance in terms of the new dataset can be generated. Furthermore, a general MtL architecture combined with our model is developed. Extensive experiments are conducted on 136 UCI datasets, demonstrating the effectiveness of the proposed approach. The empirical results on the well-known SVM show that our model can effectively recommend suitable configurations and outperform the existing MtL-based methods and the well-known search-based algorithms, such as random search, Bayesian optimization, and Hyperband.

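The pipeline described in the abstract — fit a sparse metaregression from historical (metafeature, configuration-performance) pairs, then predict each configuration's performance on a new dataset and recommend the best one — can be sketched as follows. This is a minimal illustration on synthetic data, using scikit-learn's `MultiTaskLasso` (which zeroes a metafeature's coefficients jointly across all configurations) as a stand-in for the paper's multivariate sparse-group Lasso; the data dimensions and the penalty weight are assumptions, not values from the paper.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
n_datasets, n_metafeatures, n_configs = 100, 20, 8

# Historical metadata: one metafeature vector per dataset, and the measured
# classification performance of each candidate configuration on it.
X = rng.normal(size=(n_datasets, n_metafeatures))
true_W = np.zeros((n_metafeatures, n_configs))
true_W[:5] = rng.normal(size=(5, n_configs))  # only 5 metafeatures are informative
Y = X @ true_W + 0.1 * rng.normal(size=(n_datasets, n_configs))

# Metaregression with joint sparsity: a metafeature is either kept for all
# configurations or dropped entirely, mimicking group-wise selection.
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
selected = np.flatnonzero(np.any(model.coef_ != 0, axis=0))

# Recommendation for a new dataset: predict every configuration's
# performance from its metafeatures and pick the argmax.
x_new = rng.normal(size=(1, n_metafeatures))
pred = model.predict(x_new)[0]
best_config = int(np.argmax(pred))
print("selected metafeatures:", selected)
print("recommended configuration index:", best_config)
```

The joint-sparsity penalty is what distinguishes this from fitting an independent Lasso per configuration: redundant metafeatures are removed once, for the whole recommendation task, rather than per target.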

Similar Articles

1. Metafeature Selection via Multivariate Sparse-Group Lasso Learning for Automatic Hyperparameter Configuration Recommendation.
   IEEE Trans Neural Netw Learn Syst. 2024 Sep;35(9):12540-12552. doi: 10.1109/TNNLS.2023.3263506. Epub 2024 Sep 3.
2. Hyperparameter Recommendation Integrated With Convolutional Neural Network.
   IEEE Trans Neural Netw Learn Syst. 2025 Jun;36(6):11121-11134. doi: 10.1109/TNNLS.2024.3476439.
3. A New Automatic Hyperparameter Recommendation Approach Under Low-Rank Tensor Completion Framework.
   IEEE Trans Pattern Anal Mach Intell. 2023 Apr;45(4):4038-4050. doi: 10.1109/TPAMI.2022.3195658. Epub 2023 Mar 7.
4. Optimizing Machine Learning Algorithms for Landslide Susceptibility Mapping along the Karakoram Highway, Gilgit Baltistan, Pakistan: A Comparative Study of Baseline, Bayesian, and Metaheuristic Hyperparameter Optimization Techniques.
   Sensors (Basel). 2023 Aug 1;23(15):6843. doi: 10.3390/s23156843.
5. Broad Multitask Learning System With Group Sparse Regularization.
   IEEE Trans Neural Netw Learn Syst. 2025 May;36(5):8265-8278. doi: 10.1109/TNNLS.2024.3416191. Epub 2025 May 2.
6. Particle swarm optimization-based automatic parameter selection for deep neural networks and its applications in large-scale and high-dimensional data.
   PLoS One. 2017 Dec 13;12(12):e0188746. doi: 10.1371/journal.pone.0188746. eCollection 2017.
7. Machine learning algorithms for outcome prediction in (chemo)radiotherapy: An empirical comparison of classifiers.
   Med Phys. 2018 Jul;45(7):3449-3459. doi: 10.1002/mp.12967. Epub 2018 Jun 13.
8. Machine Learning-Based Boosted Regression Ensemble Combined with Hyperparameter Tuning for Optimal Adaptive Learning.
   Sensors (Basel). 2022 May 16;22(10):3776. doi: 10.3390/s22103776.
9. A Unified Framework for Automatic Distributed Active Learning.
   IEEE Trans Pattern Anal Mach Intell. 2022 Dec;44(12):9774-9786. doi: 10.1109/TPAMI.2021.3129793. Epub 2022 Nov 7.
10. Generalized SMO algorithm for SVM-based multitask learning.
    IEEE Trans Neural Netw Learn Syst. 2012 Jun;23(6):997-1003. doi: 10.1109/TNNLS.2012.2187307.