

Pruning Decision Rules by Reduct-Based Weighting and Ranking of Features

Author information

Stańczyk Urszula

Affiliation

Department of Computer Graphics, Vision and Digital Systems, Silesian University of Technology, Akademicka 2A, 44-100 Gliwice, Poland.

Publication information

Entropy (Basel). 2022 Nov 3;24(11):1602. doi: 10.3390/e24111602.

DOI: 10.3390/e24111602
PMID: 36359692
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC9689530/
Abstract

Methods and techniques of feature selection support expert domain knowledge in the search for the attributes that are most important for a task. These approaches can also be used to tailor the obtained solutions more closely, when dimensionality reduction is aimed not only at variables but also at learners. The paper reports on research in which attribute rankings were employed to filter induced decision rules. The rankings were constructed through the proposed weighting factor based on the concept of decision reducts, a feature reduction mechanism embedded in rough set theory. Classical rough sets operate only in a discrete input space, through the indiscernibility relation; replacing it with dominance enables processing real-valued data. Decision reducts were found both for numeric attributes and for discrete attributes obtained through selected discretisation approaches. The calculated ranking scores were used to control the selection of decision rules. The performance of the resulting rule classifiers was observed over the entire range of rejected variables, for decision rules with conditions on continuous values, with discretised conditions, and for rules inferred from discrete data. The predictive powers were analysed and compared to identify trends. The experiments show that for all variants of the rule sets, not only was dimensionality reduction possible, but predictions also improved, which validates the proposed methodology.
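The pipeline the abstract describes (score attributes by their occurrence in decision reducts, rank them, then filter rules through the ranking) can be sketched in a few lines. The weighting factor below, which scores each attribute by the reducts containing it with shorter reducts contributing more, is an illustrative assumption for exposition, not the paper's exact formula; the rule and reduct structures are likewise hypothetical.

```python
# Illustrative sketch of reduct-based attribute weighting and rule pruning.
# Assumption: weight(a) = sum of 1/|R| over reducts R containing attribute a,
# so membership in short (more selective) reducts counts more.

def attribute_weights(reducts, attributes):
    """Score each attribute by its presence in decision reducts."""
    weights = {a: 0.0 for a in attributes}
    for reduct in reducts:
        for a in reduct:
            weights[a] += 1.0 / len(reduct)
    return weights

def prune_rules(rules, weights, keep_top):
    """Keep only rules whose every condition uses a top-ranked attribute."""
    ranked = sorted(weights, key=weights.get, reverse=True)
    kept = set(ranked[:keep_top])
    return [r for r in rules if set(r["conditions"]) <= kept]

# Toy example: four attributes, three reducts, three decision rules.
attrs = ["a1", "a2", "a3", "a4"]
reducts = [{"a1", "a2"}, {"a1", "a3"}, {"a2", "a3", "a4"}]
rules = [
    {"conditions": ["a1", "a2"], "decision": "yes"},
    {"conditions": ["a4"], "decision": "no"},
    {"conditions": ["a1", "a3"], "decision": "yes"},
]
w = attribute_weights(reducts, attrs)
# Drops the rule conditioned on the lowest-ranked attribute a4.
print(prune_rules(rules, w, keep_top=3))
```

Sweeping `keep_top` over the whole range of attributes reproduces the kind of experiment reported in the paper, where classifier performance is observed for every number of rejected variables.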


Figures (g001–g006):
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/04aa/9689530/fd6e70519800/entropy-24-01602-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/04aa/9689530/2deb3d2aa931/entropy-24-01602-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/04aa/9689530/492b2d2295dd/entropy-24-01602-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/04aa/9689530/3b87dd41112c/entropy-24-01602-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/04aa/9689530/3c0e5082b4f9/entropy-24-01602-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/04aa/9689530/3d18954d6957/entropy-24-01602-g006.jpg

Similar articles

1. Pruning Decision Rules by Reduct-Based Weighting and Ranking of Features.
Entropy (Basel). 2022 Nov 3;24(11):1602. doi: 10.3390/e24111602.
2. Discretisation of conditions in decision rules induced for continuous data.
PLoS One. 2020 Apr 22;15(4):e0231788. doi: 10.1371/journal.pone.0231788. eCollection 2020.
3. Improved EAV-Based Algorithm for Decision Rules Construction.
Entropy (Basel). 2023 Jan 2;25(1):91. doi: 10.3390/e25010091.
4. Rough set feature selection and rule induction for prediction of malignancy degree in brain glioma.
Comput Methods Programs Biomed. 2006 Aug;83(2):147-56. doi: 10.1016/j.cmpb.2006.06.007. Epub 2006 Aug 8.
5. δ-Cut decision-theoretic rough set approach: model and attribute reductions.
ScientificWorldJournal. 2014;2014:382439. doi: 10.1155/2014/382439. Epub 2014 Jul 22.
6. Importance of Characteristic Features and Their Form for Data Exploration.
Entropy (Basel). 2024 May 6;26(5):404. doi: 10.3390/e26050404.
7. Fuzzy-Rough Simultaneous Attribute Selection and Feature Extraction Algorithm.
IEEE Trans Cybern. 2013 Aug;43(4):1166-77. doi: 10.1109/TSMCB.2012.2225832.
8. Rough set theory based prognostic classification models for hospice referral.
BMC Med Inform Decis Mak. 2015 Nov 25;15:98. doi: 10.1186/s12911-015-0216-9.
9. A heuristic method for discovering biomarker candidates based on rough set theory.
Bioinformation. 2011;6(5):200-3. doi: 10.6026/97320630006200. Epub 2011 May 26.
10. Evolutionary and Neural Computing Based Decision Support System for Disease Diagnosis from Clinical Data Sets in Medical Practice.
J Med Syst. 2017 Sep 27;41(11):178. doi: 10.1007/s10916-017-0823-3.

Cited by

1. Exploiting Data Distribution: A Multi-Ranking Approach.
Entropy (Basel). 2025 Mar 7;27(3):278. doi: 10.3390/e27030278.
2. Importance of Characteristic Features and Their Form for Data Exploration.
Entropy (Basel). 2024 May 6;26(5):404. doi: 10.3390/e26050404.
3. Kernel Partial Least Squares Feature Selection Based on Maximum Weight Minimum Redundancy.
Entropy (Basel). 2023 Feb 10;25(2):325. doi: 10.3390/e25020325.

References

1. F*: an interpretable transformation of the F-measure.
Mach Learn. 2021;110(3):451-456. doi: 10.1007/s10994-021-05964-1. Epub 2021 Mar 15.
2. Discretisation of conditions in decision rules induced for continuous data.
PLoS One. 2020 Apr 22;15(4):e0231788. doi: 10.1371/journal.pone.0231788. eCollection 2020.