

Sequential safe feature elimination rule for L-regularized regression with Kullback-Leibler divergence.

Affiliations

Business School, Shandong Normal University, Jinan 250358, China.

Faculty of Mathematics and Artificial Intelligence, Qilu University of Technology (Shandong Academy of Sciences), Jinan 250353, China.

Publication Info

Neural Netw. 2022 Nov;155:523-535. doi: 10.1016/j.neunet.2022.09.008. Epub 2022 Sep 13.

DOI: 10.1016/j.neunet.2022.09.008
PMID: 36166979
Abstract

The L-regularized regression with Kullback-Leibler divergence (KL-LR) is a popular regression technique. Although many efforts have been devoted to its efficient implementation, it remains challenging when the number of features is extremely large. In this paper, to accelerate KL-LR, we introduce a novel and fast sequential safe feature elimination rule (FER) based on its sparsity, local regularity properties, and duality theory. It takes negligible time to select and delete most of the redundant features before and during the training process. Only one reduced model needs to be solved, which shortens the computational time. To further speed up the reduced model, the Newton coordinate descent method (Newton-CDM) is chosen as the solver. The superiority of FER is safety, i.e., its solution is exactly the same as that of the original KL-LR. Numerical experiments on three artificial datasets, five real-world datasets, and one handwritten digit dataset demonstrate the feasibility and validity of our FER.
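The safe-screening idea the abstract describes can be illustrated with the classical SAFE rule for the ordinary L1-regularized least-squares (Lasso) problem. This is a simplified stand-in under a different loss and regularizer, not the paper's KL-divergence rule, and the function name below is our own:

```python
import math

def safe_screen_lasso(X, y, lam):
    """SAFE-style screening for the Lasso (illustrative only).

    Feature j can be safely discarded when
        |x_j . y| < lam - ||x_j|| * ||y|| * (lam_max - lam) / lam_max,
    where lam_max = max_j |x_j . y| is the smallest penalty that
    yields an all-zero solution. Returns the set of eliminated
    feature indices.
    """
    n, p = len(y), len(X[0])
    # Extract columns and their absolute correlations with y.
    cols = [[X[i][j] for i in range(n)] for j in range(p)]
    corr = [abs(sum(c[i] * y[i] for i in range(n))) for c in cols]
    lam_max = max(corr)
    y_norm = math.sqrt(sum(v * v for v in y))

    eliminated = set()
    for j, c in enumerate(cols):
        col_norm = math.sqrt(sum(v * v for v in c))
        threshold = lam - col_norm * y_norm * (lam_max - lam) / lam_max
        if corr[j] < threshold:
            eliminated.add(j)
    return eliminated

X = [[1.0, 0.0, 0.1],
     [0.0, 1.0, 0.1]]
y = [1.0, 0.2]
print(safe_screen_lasso(X, y, 1.0))   # at lam = lam_max: {1, 2}
```

"Safe" here has the same meaning as in the abstract: eliminated features are guaranteed to have zero coefficients in the exact solution, so only the reduced model need be trained and the final solution is unchanged.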


Similar Articles

1. Sequential safe feature elimination rule for L-regularized regression with Kullback-Leibler divergence. Neural Netw. 2022 Nov;155:523-535. doi: 10.1016/j.neunet.2022.09.008. Epub 2022 Sep 13.
2. A Safe Feature Elimination Rule for L-Regularized Logistic Regression. IEEE Trans Pattern Anal Mach Intell. 2022 Sep;44(9):4544-4554. doi: 10.1109/TPAMI.2021.3071138. Epub 2022 Aug 4.
3. A data assimilation framework that uses the Kullback-Leibler divergence. PLoS One. 2021 Aug 26;16(8):e0256584. doi: 10.1371/journal.pone.0256584. eCollection 2021.
4. Optimistic reinforcement learning by forward Kullback-Leibler divergence optimization. Neural Netw. 2022 Aug;152:169-180. doi: 10.1016/j.neunet.2022.04.021. Epub 2022 Apr 21.
5. Toward optimal feature and time segment selection by divergence method for EEG signals classification. Comput Biol Med. 2018 Jun 1;97:161-170. doi: 10.1016/j.compbiomed.2018.04.022. Epub 2018 May 3.
6. Limited-memory fast gradient descent method for graph regularized nonnegative matrix factorization. PLoS One. 2013 Oct 21;8(10):e77162. doi: 10.1371/journal.pone.0077162. eCollection 2013.
7. A Satellite Incipient Fault Detection Method Based on Decomposed Kullback-Leibler Divergence. Entropy (Basel). 2021 Sep 9;23(9):1194. doi: 10.3390/e23091194.
8. Computation of Kullback-Leibler Divergence in Bayesian Networks. Entropy (Basel). 2021 Aug 28;23(9):1122. doi: 10.3390/e23091122.
9. An α-Divergence-Based Approach for Robust Dictionary Learning. IEEE Trans Image Process. 2019 Nov;28(11):5729-5739. doi: 10.1109/TIP.2019.2922074. Epub 2019 Jun 17.
10. The AIC criterion and symmetrizing the Kullback-Leibler divergence. IEEE Trans Neural Netw. 2007 Jan;18(1):97-106. doi: 10.1109/TNN.2006.882813.