


Attention-based random forest and contamination model.

Affiliation

Peter the Great St. Petersburg Polytechnic University, St. Petersburg, Russia.

Publication Information

Neural Netw. 2022 Oct;154:346-359. doi: 10.1016/j.neunet.2022.07.029. Epub 2022 Aug 1.

DOI: 10.1016/j.neunet.2022.07.029
PMID: 35944366
Abstract

A new approach called ABRF (the attention-based random forest) and its modifications for applying the attention mechanism to the random forest (RF) for regression and classification are proposed. The main idea behind the proposed ABRF models is to assign attention weights with trainable parameters to decision trees in a specific way. The attention weights depend on the distance between an instance, which falls into a corresponding leaf of a tree, and the training instances that fall into the same leaf. This idea stems from the representation of the Nadaraya-Watson kernel regression in the form of an RF. Three modifications of the general approach are proposed. The first is based on applying Huber's contamination model and on computing the attention weights by solving quadratic or linear optimization problems. The second and third modifications use gradient-based algorithms to compute an extended set of trainable attention parameters. Numerical experiments with various regression and classification datasets illustrate the proposed method. The code implementing the approach is publicly available.
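The core idea described in the abstract can be illustrated with a short sketch. This is a hypothetical minimal implementation of the attention-over-trees intuition (not the authors' published code): for a test instance, each tree contributes the mean target of the training instances sharing its leaf, and the trees are combined with Nadaraya-Watson-style softmax weights derived from the distance between the test instance and the mean of those co-leaf training instances. The temperature `tau`, the dataset, and all function names are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_friedman1
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# Toy regression data (illustrative choice, not from the paper).
X, y = make_friedman1(n_samples=400, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestRegressor(n_estimators=50, random_state=0).fit(X_tr, y_tr)

# Leaf index of every training instance in every tree: shape (n_train, n_trees).
train_leaves = rf.apply(X_tr)

def abrf_predict(X_new, tau=1.0):
    """Attention-weighted RF prediction: softmax over per-tree leaf distances."""
    test_leaves = rf.apply(X_new)            # shape (n_test, n_trees)
    preds = np.empty(len(X_new))
    for i, x in enumerate(X_new):
        centers, leaf_means = [], []
        for t in range(rf.n_estimators):
            # Training instances that fall into the same leaf as x in tree t.
            mask = train_leaves[:, t] == test_leaves[i, t]
            centers.append(X_tr[mask].mean(axis=0))   # mean co-leaf instance
            leaf_means.append(y_tr[mask].mean())      # mean co-leaf target
        d2 = ((x - np.array(centers)) ** 2).sum(axis=1)
        # Softmax attention over trees (shifted for numerical stability).
        w = np.exp(-(d2 - d2.min()) / tau)
        w /= w.sum()
        preds[i] = w @ np.array(leaf_means)
    return preds

pred = abrf_predict(X_te)
```

Note that this sketch uses fixed softmax weights; the paper's contribution is making the attention parameters trainable, e.g. via the contamination-model optimization or gradient descent described above.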


Similar Articles

1. Attention-based random forest and contamination model. Neural Netw. 2022 Oct;154:346-359. doi: 10.1016/j.neunet.2022.07.029. Epub 2022 Aug 1.
2. Application of the Nadaraya-Watson estimator based attention mechanism to the field of predictive maintenance. MethodsX. 2024 May 17;12:102754. doi: 10.1016/j.mex.2024.102754. eCollection 2024 Jun.
3. Oblique and rotation double random forest. Neural Netw. 2022 Sep;153:496-517. doi: 10.1016/j.neunet.2022.06.012. Epub 2022 Jun 18.
4. Heterogeneous multiple kernel learning for breast cancer outcome evaluation. BMC Bioinformatics. 2020 Apr 23;21(1):155. doi: 10.1186/s12859-020-3483-0.
5. Optimizing neural networks for medical data sets: A case study on neonatal apnea prediction. Artif Intell Med. 2019 Jul;98:59-76. doi: 10.1016/j.artmed.2019.07.008. Epub 2019 Jul 25.
6. A novel approach to build accurate and diverse decision tree forest. Evol Intell. 2022;15(1):439-453. doi: 10.1007/s12065-020-00519-0. Epub 2021 Jan 3.
7. GSEA-SDBE: A gene selection method for breast cancer classification based on GSEA and analyzing differences in performance metrics. PLoS One. 2022 Apr 26;17(4):e0263171. doi: 10.1371/journal.pone.0263171. eCollection 2022.
8. A Comparative Assessment of the Influences of Human Impacts on Soil Cd Concentrations Based on Stepwise Linear Regression, Classification and Regression Tree, and Random Forest Models. PLoS One. 2016 Mar 10;11(3):e0151131. doi: 10.1371/journal.pone.0151131. eCollection 2016.
9. Application of rotation forest with decision trees as base classifier and a novel ensemble model in spatial modeling of groundwater potential. Environ Monit Assess. 2019 Mar 27;191(4):248. doi: 10.1007/s10661-019-7362-y.
10. Classification and Explanation for Intrusion Detection System Based on Ensemble Trees and SHAP Method. Sensors (Basel). 2022 Feb 3;22(3):1154. doi: 10.3390/s22031154.

Cited By

1. Immunophenotyping identifies key immune biomarkers for coronary artery disease through machine learning. PLoS One. 2025 Aug 26;20(8):e0328811. doi: 10.1371/journal.pone.0328811. eCollection 2025.
2. A Machine Learning-Based Diagnostic Nomogram for Moyamoya Disease: The Validation of Hypoxia-Immune Gene Signatures. Bioengineering (Basel). 2025 May 27;12(6):577. doi: 10.3390/bioengineering12060577.
3. Dynamic Gene Attention Focus (DyGAF): Enhancing Biomarker Identification Through Dual-Model Attention Networks. Bioinform Biol Insights. 2025 Mar 27;19:11779322251325390. doi: 10.1177/11779322251325390. eCollection 2025.
4. Utilizing SMOTE-TomekLink and machine learning to construct a predictive model for elderly medical and daily care services demand. Sci Rep. 2025 Mar 11;15(1):8446. doi: 10.1038/s41598-025-92722-1.
5. Screening of mitochondrial-related biomarkers connected with immune infiltration for acute respiratory distress syndrome through WGCNA and machine learning. Medicine (Baltimore). 2025 Mar 7;104(10):e41497. doi: 10.1097/MD.0000000000041497.
6. Applying machine learning algorithms to develop a survival prediction model for lung adenocarcinoma based on genes related to fatty acid metabolism. Front Pharmacol. 2023 Oct 17;14:1260742. doi: 10.3389/fphar.2023.1260742. eCollection 2023.
7. Exploration of m6A methylation regulators as epigenetic targets for immunotherapy in advanced sepsis. BMC Bioinformatics. 2023 Jun 17;24(1):257. doi: 10.1186/s12859-023-05379-w.