


Analysis of Information-Based Nonparametric Variable Selection Criteria.

Authors

Łazęcka Małgorzata, Mielniczuk Jan

Affiliations

Institute of Computer Science, Polish Academy of Sciences, Jana Kazimierza 5, 01-248 Warsaw, Poland.

Faculty of Mathematics and Information Science, Warsaw University of Technology, Koszykowa 75, 00-662 Warsaw, Poland.

Publication

Entropy (Basel). 2020 Aug 31;22(9):974. doi: 10.3390/e22090974.

DOI: 10.3390/e22090974
PMID: 33286743
Full text: https://pmc.ncbi.nlm.nih.gov/articles/PMC7597280/
Abstract

We consider a nonparametric Generative Tree Model and discuss the problem of selecting active predictors for the response in such a scenario. We investigate two popular information-based selection criteria: Conditional Infomax Feature Extraction (CIFE) and Joint Mutual Information (JMI), both derived as approximations of the Conditional Mutual Information (CMI) criterion. We show that both CIFE and JMI may exhibit behavior different from CMI, resulting in different orders in which predictors are chosen in the variable selection process. Explicit formulae for CMI and its two approximations in the generative tree model are obtained. As a byproduct, we establish expressions for the entropy of a multivariate Gaussian mixture and its mutual information with the mixing distribution.
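The three greedy criteria named in the abstract can be sketched with plug-in entropy estimates on discrete samples. This is an illustrative sketch, not the paper's method: the helper names (`cife_score`, `jmi_score`, etc.) and the simple histogram estimator are our own choices, and the paper's analysis concerns the population (not estimated) versions of these quantities.

```python
import numpy as np
from collections import Counter

def entropy(*cols):
    """Plug-in Shannon entropy (in nats) of the joint distribution of discrete columns."""
    counts = np.array(list(Counter(zip(*cols)).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def mi(x, y):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(x, y)

def cmi(x, y, z):
    """Conditional mutual information I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)."""
    return entropy(x, z) + entropy(y, z) - entropy(x, y, z) - entropy(z)

def cife_score(xk, y, selected):
    """CIFE criterion for candidate Xk given selected set S:
    I(Xk;Y) - sum_{Xj in S} [ I(Xk;Xj) - I(Xk;Xj|Y) ]."""
    return mi(xk, y) - sum(mi(xk, xj) - cmi(xk, xj, y) for xj in selected)

def jmi_score(xk, y, selected):
    """JMI criterion: sum_{Xj in S} I((Xk,Xj); Y), pairing the candidate
    with each already-selected feature."""
    return sum(mi(list(zip(xk, xj)), y) for xj in selected)
```

With an empty selected set, CIFE reduces to the marginal relevance I(Xk;Y); greedy forward selection then adds, at each step, the candidate maximizing the chosen score. The divergence between the orderings induced by CIFE, JMI, and exact CMI is precisely what the paper analyzes in the generative tree model.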


Figures

https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0a79/7597280/41e005a24db0/entropy-22-00974-g001.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0a79/7597280/6931a0a9c2ce/entropy-22-00974-g002.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0a79/7597280/3cb03505f13b/entropy-22-00974-g003.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0a79/7597280/48e596414e3d/entropy-22-00974-g004.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0a79/7597280/7698f4f51d63/entropy-22-00974-g005.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0a79/7597280/aa93979e9267/entropy-22-00974-g006.jpg
https://cdn.ncbi.nlm.nih.gov/pmc/blobs/0a79/7597280/19499a91026c/entropy-22-00974-g007.jpg

Similar Articles

1
Analysis of Information-Based Nonparametric Variable Selection Criteria.
Entropy (Basel). 2020 Aug 31;22(9):974. doi: 10.3390/e22090974.
2
Simple Stopping Criteria for Information Theoretic Feature Selection.
Entropy (Basel). 2019 Jan 21;21(1):99. doi: 10.3390/e21010099.
3
Information Theoretic Methods for Variable Selection-A Review.
Entropy (Basel). 2022 Aug 4;24(8):1079. doi: 10.3390/e24081079.
4
Feature selection by optimizing a lower bound of conditional mutual information.
Inf Sci (N Y). 2017 Dec;418-419:652-667. doi: 10.1016/j.ins.2017.08.036. Epub 2017 Aug 9.
5
Nonrigid image registration using conditional mutual information.
IEEE Trans Med Imaging. 2010 Jan;29(1):19-29. doi: 10.1109/TMI.2009.2021843. Epub 2009 May 12.
6
Is mutual information adequate for feature selection in regression?
Neural Netw. 2013 Dec;48:1-7. doi: 10.1016/j.neunet.2013.07.003. Epub 2013 Jul 11.
7
Vine copula selection using mutual information for hydrological dependence modeling.
Environ Res. 2020 Jul;186:109604. doi: 10.1016/j.envres.2020.109604. Epub 2020 Apr 28.
8
A Novel Nonparametric Feature Selection Approach Based on Mutual Information Transfer Network.
Entropy (Basel). 2022 Sep 7;24(9):1255. doi: 10.3390/e24091255.
9
Adaptation of Partial Mutual Information from Mixed Embedding to Discrete-Valued Time Series.
Entropy (Basel). 2022 Oct 22;24(11):1505. doi: 10.3390/e24111505.
10
Rolling Bearing Diagnosis Based on Composite Multiscale Weighted Permutation Entropy.
Entropy (Basel). 2018 Oct 24;20(11):821. doi: 10.3390/e20110821.

Cited By

1
Information Theoretic Methods for Variable Selection-A Review.
Entropy (Basel). 2022 Aug 4;24(8):1079. doi: 10.3390/e24081079.
2
Nonparametric Statistical Inference with an Emphasis on Information-Theoretic Methods.
Entropy (Basel). 2022 Apr 15;24(4):553. doi: 10.3390/e24040553.

References

1
Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy.
IEEE Trans Pattern Anal Mach Intell. 2005 Aug;27(8):1226-38. doi: 10.1109/TPAMI.2005.159.