Normalized mutual information feature selection.

Author information

Estévez Pablo A, Tesmer Michel, Perez Claudio A, Zurada Jacek M

Affiliations

Department of Electrical Engineering, University of Chile, Casilla 412-3, Santiago 8370451, Chile.

Publication details

IEEE Trans Neural Netw. 2009 Feb;20(2):189-201. doi: 10.1109/TNN.2008.2005601. Epub 2009 Jan 13.

Abstract

A filter method of feature selection based on mutual information, called normalized mutual information feature selection (NMIFS), is presented. NMIFS is an enhancement over Battiti's MIFS, as well as the MIFS-U and mRMR methods. The average normalized mutual information is proposed as a measure of redundancy among features. NMIFS outperformed MIFS, MIFS-U, and mRMR on several artificial and benchmark data sets without requiring a user-defined parameter. In addition, NMIFS is combined with a genetic algorithm to form a hybrid filter/wrapper method called GAMIFS. GAMIFS includes an initialization procedure and a mutation operator, both based on NMIFS, to speed up the convergence of the genetic algorithm, and it overcomes the limitation of incremental search algorithms, which are unable to find dependencies between groups of features.
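The greedy selection rule behind NMIFS can be sketched as follows. This is a minimal illustration, not the authors' reference implementation: it assumes discrete features, estimates entropies by simple frequency counts, and uses the criterion of maximizing relevance I(C; f) minus the average normalized redundancy I(f; s) / min{H(f), H(s)} over already-selected features s. All function names are illustrative.

```python
import math
from collections import Counter

def entropy(x):
    """Shannon entropy (bits) of a discrete sequence, via frequency counts."""
    n = len(x)
    return -sum((c / n) * math.log2(c / n) for c in Counter(x).values())

def mutual_info(x, y):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) for discrete sequences."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def normalized_mi(x, y):
    """Mutual information normalized by the smaller entropy, so it lies in [0, 1]."""
    denom = min(entropy(x), entropy(y))
    return mutual_info(x, y) / denom if denom > 0 else 0.0

def nmifs_select(features, target, k):
    """Greedy NMIFS: at each step pick the feature maximizing
    I(C; f) minus the average normalized MI with the selected set."""
    remaining = list(range(len(features)))
    # First feature: the one most relevant to the class.
    first = max(remaining, key=lambda i: mutual_info(features[i], target))
    selected = [first]
    remaining.remove(first)
    while len(selected) < k and remaining:
        def score(i):
            redundancy = sum(
                normalized_mi(features[i], features[s]) for s in selected
            ) / len(selected)
            return mutual_info(features[i], target) - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a toy data set with two nearly identical features and one complementary feature, the normalized redundancy term steers the second pick away from the duplicate, which is the behavior that distinguishes NMIFS from relevance-only ranking.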
