

LassoNet: Neural Networks with Feature Sparsity.

Author Information

Lemhadri Ismael, Ruan Feng, Tibshirani Robert

Affiliation

Stanford University.

Publication Information

Proc Mach Learn Res. 2021 Apr;130:10-18.

Abstract

Much work has been done recently to make neural networks more interpretable, and one approach is to arrange for the network to use only a subset of the available features. In linear models, Lasso (or ℓ1-regularized) regression assigns zero weights to the most irrelevant or redundant features, and is widely used in data science. However, the Lasso only applies to linear models. Here we introduce LassoNet, a neural network framework with global feature selection. Our approach achieves feature sparsity by allowing a feature to participate in a hidden unit only if its linear representative is active. Unlike other approaches to feature selection for neural nets, our method uses a modified objective function with constraints, and so integrates feature selection directly with parameter learning. As a result, it delivers an entire regularization path of solutions with a range of feature sparsity. In experiments with real and simulated data, LassoNet significantly outperforms state-of-the-art methods for feature selection and regression. The LassoNet method uses projected proximal gradient descent, and generalizes directly to deep networks. It can be implemented by adding just a few lines of code to a standard neural network.

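The abstract's central idea can be made concrete with a short sketch. The following NumPy snippet is a simplified illustration (not the paper's exact proximal operator; the function and parameter names here are ours) of the step applied after each gradient update: soft-threshold the linear skip-layer weights θ, then clip each feature's first-layer weight row so that ‖W_j‖∞ ≤ M·|θ_j|. This is one way to realize the stated constraint that a feature may participate in a hidden unit only while its linear representative is active.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding: S_t(x) = sign(x) * max(|x| - t, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lassonet_project(theta, W, lam, M, lr):
    """One simplified LassoNet-style sparsity step (illustrative, not Hier-Prox).

    theta : (d,) skip-connection (linear) weights, one per input feature
    W     : (d, K) first-layer weights; row j belongs to feature j
    lam   : lasso penalty strength, M : hierarchy constant, lr : step size

    Soft-threshold theta (the lasso part), then clip each row of W into
    the box [-M*|theta_j|, M*|theta_j|] so the constraint holds exactly.
    """
    theta = soft_threshold(theta, lr * lam)
    bound = M * np.abs(theta)                      # per-feature box bound
    W = np.clip(W, -bound[:, None], bound[:, None])
    return theta, W
```

Once λ is large enough that θ_j is thresholded to zero, the entire row W_j collapses to zero and feature j drops out of the network; sweeping λ from large to small traces out the regularization path of solutions the abstract describes.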

Similar Articles

1
LassoNet: Neural Networks with Feature Sparsity.
Proc Mach Learn Res. 2021 Apr;130:10-18.
2
Feature flow regularization: Improving structured sparsity in deep neural networks.
Neural Netw. 2023 Apr;161:598-613. doi: 10.1016/j.neunet.2023.02.013. Epub 2023 Feb 13.
3
Combined-task deep network based on LassoNet feature selection for predicting the comorbidities of acute coronary syndrome.
Comput Biol Med. 2024 Mar;170:107992. doi: 10.1016/j.compbiomed.2024.107992. Epub 2024 Jan 15.
4
LassoNet: Deep Lasso-Selection of 3D Point Clouds.
IEEE Trans Vis Comput Graph. 2020 Jan;26(1):195-204. doi: 10.1109/TVCG.2019.2934332. Epub 2024 Mar 12.
5
Nonlinear Feature Selection Neural Network via Structured Sparse Regularization.
IEEE Trans Neural Netw Learn Syst. 2023 Nov;34(11):9493-9505. doi: 10.1109/TNNLS.2022.3209716. Epub 2023 Oct 27.
6
Biobjective gradient descent for feature selection on high dimension, low sample size data.
PLoS One. 2024 Jul 18;19(7):e0305654. doi: 10.1371/journal.pone.0305654. eCollection 2024.
7
A universal deep learning approach for modeling the flow of patients under different severities.
Comput Methods Programs Biomed. 2018 Feb;154:191-203. doi: 10.1016/j.cmpb.2017.11.003. Epub 2017 Nov 7.
8
Boosted network classifiers for local feature selection.
IEEE Trans Neural Netw Learn Syst. 2012 Nov;23(11):1767-78. doi: 10.1109/TNNLS.2012.2214057.
9
Transformed ℓ1 regularization for learning sparse deep neural networks.
Neural Netw. 2019 Nov;119:286-298. doi: 10.1016/j.neunet.2019.08.015. Epub 2019 Aug 27.
10
Sparse Manifold-Regularized Neural Networks for Polarimetric SAR Terrain Classification.
IEEE Trans Neural Netw Learn Syst. 2020 Aug;31(8):3007-3016. doi: 10.1109/TNNLS.2019.2935027. Epub 2019 Sep 12.

Cited By

2
Machine Learning-Driven Prediction of One-Year Readmission in HFrEF Patients: The Key Role of Inflammation.
Clin Interv Aging. 2025 Jul 24;20:1071-1084. doi: 10.2147/CIA.S528442. eCollection 2025.
3
Semi-supervised data-integrated feature importance enhances performance and interpretability of biological classification tasks.
Bioinformatics. 2025 Jul 1;41(Supplement_1):i373-i381. doi: 10.1093/bioinformatics/btaf190.
4
Multi-task genomic prediction using gated residual variable selection neural networks.
BMC Bioinformatics. 2025 Jul 7;26(1):167. doi: 10.1186/s12859-025-06188-z.
5
Variable Selection for Multivariate Failure Time Data via Regularized Sparse-Input Neural Network.
Bioengineering (Basel). 2025 May 31;12(6):596. doi: 10.3390/bioengineering12060596.
6
Artificial intelligence-derived retinal age gap as a marker for reproductive aging in women.
NPJ Digit Med. 2025 Jun 16;8(1):367. doi: 10.1038/s41746-025-01699-8.
7
Enhancing ERα-targeted compound efficacy in breast cancer therapy with ExplainableAI and GeneticAlgorithm.
PLoS One. 2025 May 20;20(5):e0319673. doi: 10.1371/journal.pone.0319673. eCollection 2025.
10
Tabular deep learning: a comparative study applied to multi-task genome-wide prediction.
BMC Bioinformatics. 2024 Oct 4;25(1):322. doi: 10.1186/s12859-024-05940-1.

References

1
Learning interactions via hierarchical group-lasso regularization.
J Comput Graph Stat. 2015;24(3):627-654. doi: 10.1080/10618600.2014.938812. Epub 2015 Sep 16.
2
An experimental comparison of feature selection methods on two-class biomedical datasets.
Comput Biol Med. 2015 Nov 1;66:1-10. doi: 10.1016/j.compbiomed.2015.08.010. Epub 2015 Aug 24.
3
Self-Organizing Feature Maps Identify Proteins Critical to Learning in a Mouse Model of Down Syndrome.
PLoS One. 2015 Jun 25;10(6):e0129126. doi: 10.1371/journal.pone.0129126. eCollection 2015.
4
A significance test for the lasso.
Ann Stat. 2014 Apr;42(2):413-468. doi: 10.1214/13-AOS1175.
5
High-dimensional feature selection by feature-wise kernelized Lasso.
Neural Comput. 2014 Jan;26(1):185-207. doi: 10.1162/NECO_a_00537. Epub 2013 Oct 8.
8
Proteomic applications for the early detection of cancer.
Nat Rev Cancer. 2003 Apr;3(4):267-75. doi: 10.1038/nrc1043.
