Sparse Reduced Rank Huber Regression in High Dimensions.

Author Information

Tan Kean Ming, Sun Qiang, Witten Daniela

Affiliation Information

Department of Statistics, University of Michigan, Ann Arbor, MI.

Department of Statistical Sciences, University of Toronto, Toronto, ON, Canada.

Publication Information

J Am Stat Assoc. 2023;118(544):2383-2393. doi: 10.1080/01621459.2022.2050243. Epub 2022 Apr 15.

Abstract

We propose a sparse reduced rank Huber regression for analyzing large and complex high-dimensional data with heavy-tailed random noise. The proposed method is based on a convex relaxation of a rank- and sparsity-constrained nonconvex optimization problem, which is then solved using a block coordinate descent and an alternating direction method of multipliers algorithm. We establish nonasymptotic estimation error bounds under both Frobenius and nuclear norms in the high-dimensional setting. This is a major contribution over existing results in reduced rank regression, which mainly focus on rank selection and prediction consistency. Our theoretical results quantify the tradeoff between heavy-tailedness of the random noise and statistical bias. For random noise with bounded (1+δ)th moment with δ ∈ (0, 1), the rate of convergence is a function of δ and is slower than the sub-Gaussian-type deviation bounds; for random noise with bounded second moment, we obtain a rate of convergence as if sub-Gaussian noise were assumed. We illustrate the performance of the proposed method via extensive numerical studies and a data application. Supplementary materials for this article are available online.
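
The abstract describes the estimator only at a high level: a multi-response coefficient matrix is fit with the Huber loss, ℓ_τ(u) = u²/2 for |u| ≤ τ and τ|u| − τ²/2 otherwise, under a convex surrogate for the rank and sparsity constraints. The sketch below (plain NumPy) illustrates that general idea with a nuclear-norm plus elementwise ℓ1 penalty and simple proximal gradient steps; it is not the authors' block coordinate descent/ADMM implementation, and every function name and tuning constant is an illustrative assumption.

import numpy as np

def huber_grad(R, tau):
    """Elementwise derivative of the Huber loss with robustification parameter tau."""
    return np.where(np.abs(R) <= tau, R, tau * np.sign(R))

def prox_nuclear(C, t):
    """Proximal operator of t * nuclear norm: singular-value soft-thresholding."""
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    return U @ np.diag(np.maximum(s - t, 0.0)) @ Vt

def prox_l1(C, t):
    """Proximal operator of t * elementwise L1 norm: soft-thresholding."""
    return np.sign(C) * np.maximum(np.abs(C) - t, 0.0)

def sparse_lowrank_huber(X, Y, tau=1.0, lam_nuc=0.1, lam_l1=0.05, step=None, n_iter=500):
    """Fit B (p x q) in Y ~ X B by minimizing the averaged Huber loss plus
    lam_nuc * ||B||_* + lam_l1 * ||B||_1, using proximal gradient steps that
    apply the two proximal maps in turn (an approximation: the proximal map
    of the summed penalty does not factor exactly)."""
    n, p = X.shape
    q = Y.shape[1]
    B = np.zeros((p, q))
    if step is None:
        # 1 / Lipschitz constant of the smooth part (the Huber loss is 1-smooth).
        step = 1.0 / (np.linalg.norm(X, 2) ** 2 / n)
    for _ in range(n_iter):
        R = X @ B - Y                        # residual matrix, n x q
        grad = X.T @ huber_grad(R, tau) / n  # gradient of the Huber data fit
        B = B - step * grad
        B = prox_nuclear(B, step * lam_nuc)  # encourage low rank
        B = prox_l1(B, step * lam_l1)        # encourage elementwise sparsity
    return B

# Toy usage: sparse, rank-1 signal with heavy-tailed (Student-t, 2 df) noise.
rng = np.random.default_rng(0)
n, p, q = 200, 30, 10
X = rng.standard_normal((n, p))
B_true = np.zeros((p, q))
B_true[:3, :] = np.outer(rng.standard_normal(3), rng.standard_normal(q))
Y = X @ B_true + rng.standard_t(df=2, size=(n, q))
B_hat = sparse_lowrank_huber(X, Y)
print("estimation error (Frobenius):", np.linalg.norm(B_hat - B_true, "fro"))

Composing the two proximal maps sequentially is itself only an approximation of the composite penalty's proximal operator, which is one reason a dedicated splitting scheme such as block coordinate descent combined with ADMM, as used in the paper, is attractive for this problem.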

Similar Articles

1
Sparse Reduced Rank Huber Regression in High Dimensions.
J Am Stat Assoc. 2023;118(544):2383-2393. doi: 10.1080/01621459.2022.2050243. Epub 2022 Apr 15.
2
Adaptive Huber Regression.
J Am Stat Assoc. 2020;115(529):254-265. doi: 10.1080/01621459.2018.1543124. Epub 2019 Apr 22.
3
A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery.
Ann Stat. 2021 Jun;49(3):1239-1266. doi: 10.1214/20-aos1980. Epub 2021 Aug 9.
4
Hyperspectral Images Denoising via Nonconvex Regularized Low-Rank and Sparse Matrix Decomposition.
IEEE Trans Image Process. 2020;29:44-56. doi: 10.1109/TIP.2019.2926736. Epub 2019 Jul 12.
5
Robust Tensor Completion via Capped Frobenius Norm.
IEEE Trans Neural Netw Learn Syst. 2024 Jul;35(7):9700-9712. doi: 10.1109/TNNLS.2023.3236415. Epub 2024 Jul 8.
6
Sparse low-rank separated representation models for learning from data.
Proc Math Phys Eng Sci. 2019 Jan;475(2221):20180490. doi: 10.1098/rspa.2018.0490. Epub 2019 Jan 9.
7
Noisy matrix completion: understanding statistical guarantees for convex relaxation via nonconvex optimization.
SIAM J Optim. 2020;30(4):3098-3121. doi: 10.1137/19m1290000. Epub 2020 Oct 28.
8
Off-Grid DOA Estimation Using Alternating Block Coordinate Descent in Compressed Sensing.
Sensors (Basel). 2015 Aug 27;15(9):21099-113. doi: 10.3390/s150921099.
9
Adaptive Rank and Structured Sparsity Corrections for Hyperspectral Image Restoration.
IEEE Trans Cybern. 2022 Sep;52(9):8729-8740. doi: 10.1109/TCYB.2021.3051656. Epub 2022 Aug 18.

Cited By

1
Robust convex biclustering with a tuning-free method.
J Appl Stat. 2024 Jun 17;52(2):271-286. doi: 10.1080/02664763.2024.2367143. eCollection 2025.

References

1
Adaptive Huber Regression.
J Am Stat Assoc. 2020;115(529):254-265. doi: 10.1080/01621459.2018.1543124. Epub 2019 Apr 22.
2
I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error.
Ann Stat. 2018 Apr;46(2):814-841. doi: 10.1214/17-AOS1568. Epub 2018 Apr 3.
3
Robust reduced-rank regression.
Biometrika. 2017 Sep;104(3):633-647. doi: 10.1093/biomet/asx032. Epub 2017 Jul 12.
4
The cluster graphical lasso for improved estimation of Gaussian graphical models.
Comput Stat Data Anal. 2015 May;85:23-36. doi: 10.1016/j.csda.2014.11.015.
5
Reduced rank regression via adaptive nuclear norm penalization.
Biometrika. 2013 Dec 4;100(4):901-920. doi: 10.1093/biomet/ast036.
6
Reduced Rank Ridge Regression and Its Kernel Extensions.
Stat Anal Data Min. 2011 Dec;4(6):612-622. doi: 10.1002/sam.10138. Epub 2011 Oct 7.
7
Robust recovery of subspace structures by low-rank representation.
IEEE Trans Pattern Anal Mach Intell. 2013 Jan;35(1):171-84. doi: 10.1109/TPAMI.2012.88.
8
An Arabidopsis gene network based on the graphical Gaussian model.
Genome Res. 2007 Nov;17(11):1614-25. doi: 10.1101/gr.6911207. Epub 2007 Oct 5.
9
Sparse graphical Gaussian modeling of the isoprenoid gene network in Arabidopsis thaliana.
Genome Biol. 2004;5(11):R92. doi: 10.1186/gb-2004-5-11-r92. Epub 2004 Oct 25.
