

Extensions to Online Feature Selection Using Bagging and Boosting.

Author information

Ditzler Gregory, LaBarck Joseph, Ritchie James, Rosen Gail, Polikar Robi

Publication information

IEEE Trans Neural Netw Learn Syst. 2018 Sep;29(9):4504-4509. doi: 10.1109/TNNLS.2017.2746107. Epub 2017 Oct 11.

DOI: 10.1109/TNNLS.2017.2746107
PMID: 29028210
Abstract

Feature subset selection can be used to sieve through large volumes of data and discover the most informative subset of variables for a particular learning problem. Yet, due to memory and other resource constraints (e.g., CPU availability), many of the state-of-the-art feature subset selection methods cannot be extended to high dimensional data, or data sets with an extremely large volume of instances. In this brief, we extend online feature selection (OFS), a recently introduced approach that uses partial feature information, by developing an ensemble of online linear models to make predictions. The OFS approach employs a linear model as the base classifier, which allows the $l_0$-norm of the parameter vector to be constrained to perform feature selection leading to sparse linear models. We demonstrate that the proposed ensemble model typically yields a smaller error rate than any single linear model, while maintaining the same level of sparsity and complexity at the time of testing.
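The core idea in the abstract — an online linear learner whose parameter vector is constrained in $l_0$-norm, wrapped in a bagged ensemble — can be sketched roughly as follows. This is a simplified illustration, not the authors' implementation: the class names, the mistake-driven perceptron-style update, and the Poisson(1) online-bagging weights (in the style of Oza and Russell's online bagging) are assumptions made for the sketch. The $l_0$ constraint is enforced here by keeping only the B largest-magnitude weights after each update.

```python
import numpy as np

def truncate(w, B):
    """Enforce the l0 constraint: keep the B largest-magnitude weights, zero the rest."""
    if np.count_nonzero(w) <= B:
        return w
    w = w.copy()
    w[np.argsort(np.abs(w))[:-B]] = 0.0  # indices of all but the B largest magnitudes
    return w

class OFSLinear:
    """A single online linear classifier with an l0-truncation step (sketch)."""
    def __init__(self, dim, B, eta=0.2):
        self.w = np.zeros(dim)
        self.B = B      # feature budget (l0 bound)
        self.eta = eta  # learning rate

    def predict(self, x):
        return 1 if self.w @ x >= 0 else -1

    def update(self, x, y):
        if y * (self.w @ x) <= 0:  # mistake-driven update, then re-truncate
            self.w = truncate(self.w + self.eta * y * x, self.B)

class BaggedOFS:
    """Online bagging over OFS learners: each example is replayed Poisson(1) times
    per base model to simulate a bootstrap sample; prediction is a majority vote."""
    def __init__(self, n_models, dim, B, seed=0):
        self.models = [OFSLinear(dim, B) for _ in range(n_models)]
        self.rng = np.random.default_rng(seed)

    def update(self, x, y):
        for m in self.models:
            for _ in range(self.rng.poisson(1.0)):
                m.update(x, y)

    def predict(self, x):
        return 1 if sum(m.predict(x) for m in self.models) >= 0 else -1
```

Because each base model sees a differently weighted stream, the ensemble vote tends to reduce the variance of any single truncated linear model, while each model still satisfies the same sparsity budget B at test time.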


Similar articles

1. Extensions to Online Feature Selection Using Bagging and Boosting.
   IEEE Trans Neural Netw Learn Syst. 2018 Sep;29(9):4504-4509. doi: 10.1109/TNNLS.2017.2746107. Epub 2017 Oct 11.
2. Pairwise Constraint-Guided Sparse Learning for Feature Selection.
   IEEE Trans Cybern. 2016 Jan;46(1):298-310. doi: 10.1109/TCYB.2015.2401733. Epub 2015 Jul 6.
3. A Sequential Learning Approach for Scaling Up Filter-Based Feature Subset Selection.
   IEEE Trans Neural Netw Learn Syst. 2018 Jun;29(6):2530-2544. doi: 10.1109/TNNLS.2017.2697407. Epub 2017 May 11.
4. Support vector machines with constraints for sparsity in the primal parameters.
   IEEE Trans Neural Netw. 2011 Aug;22(8):1269-83. doi: 10.1109/TNN.2011.2148727. Epub 2011 Jul 5.
5. Non-Negative Spectral Learning and Sparse Regression-Based Dual-Graph Regularized Feature Selection.
   IEEE Trans Cybern. 2018 Feb;48(2):793-806. doi: 10.1109/TCYB.2017.2657007. Epub 2017 Mar 6.
6. Ensemble Feature Learning of Genomic Data Using Support Vector Machine.
   PLoS One. 2016 Jun 15;11(6):e0157330. doi: 10.1371/journal.pone.0157330. eCollection 2016.
7. Ensemble Merit Merge Feature Selection for Enhanced Multinomial Classification in Alzheimer's Dementia.
   Comput Math Methods Med. 2015;2015:676129. doi: 10.1155/2015/676129. Epub 2015 Oct 20.
8. Constructing ensembles of classifiers by means of weighted instance selection.
   IEEE Trans Neural Netw. 2009 Feb;20(2):258-77. doi: 10.1109/TNN.2008.2005496. Epub 2009 Jan 27.
9. Randomized boosting with multivariable base-learners for high-dimensional variable selection and prediction.
   BMC Bioinformatics. 2021 Sep 16;22(1):441. doi: 10.1186/s12859-021-04340-z.
10. Joint embedding learning and sparse regression: a framework for unsupervised feature selection.
    IEEE Trans Cybern. 2014 Jun;44(6):793-804. doi: 10.1109/TCYB.2013.2272642. Epub 2013 Jul 22.

Cited by

1. An advanced machine learning method for simultaneous breast cancer risk prediction and risk ranking in Chinese population: A prospective cohort and modeling study.
   Chin Med J (Engl). 2024 Sep 5;137(17):2084-2091. doi: 10.1097/CM9.0000000000002891. Epub 2024 Feb 26.
2. Survival Prediction Model for Patients with Hepatocellular Carcinoma and Extrahepatic Metastasis Based on XGBoost Algorithm.
   J Hepatocell Carcinoma. 2023 Dec 13;10:2251-2263. doi: 10.2147/JHC.S429903. eCollection 2023.
3. Predictive model for the 5-year survival status of osteosarcoma patients based on the SEER database and XGBoost algorithm.
   Sci Rep. 2021 Mar 10;11(1):5542. doi: 10.1038/s41598-021-85223-4.
4. Dropout Deep Belief Network Based Chinese Ancient Ceramic Non-Destructive Identification.
   Sensors (Basel). 2021 Feb 12;21(4):1318. doi: 10.3390/s21041318.
5. An alternative approach to dimension reduction for pareto distributed data: a case study.
   J Big Data. 2021;8(1):39. doi: 10.1186/s40537-021-00428-8. Epub 2021 Feb 25.