Ghosh Satyajit, Khare Kshitij, Michailidis George
Department of Statistics and the Informatics Institute, University of Florida.
J Am Stat Assoc. 2019;114(526):735-748. doi: 10.1080/01621459.2018.1437043. Epub 2018 Aug 7.
Vector autoregressive (VAR) models aim to capture linear temporal interdependencies amongst multiple time series. They have been widely used in macroeconomics and financial econometrics and more recently have found novel applications in functional genomics and neuroscience. These applications have also accentuated the need to investigate the behavior of the VAR model in a high-dimensional regime, which provides novel insights into the role of temporal dependence for regularized estimates of the model's parameters. However, hardly anything is known regarding properties of the posterior distribution for Bayesian VAR models in such regimes. In this work, we consider a VAR model with two prior choices for the autoregressive coefficient matrix: a non-hierarchical matrix-normal prior and a hierarchical prior, which corresponds to a scale mixture of normals. We establish posterior consistency for both these priors under standard regularity assumptions, when the dimension of the VAR model grows with the sample size (but still remains smaller than the sample size). A special case corresponds to a shrinkage prior that introduces (group) sparsity in the columns of the model coefficient matrices. The performance of the model estimates is illustrated on synthetic and real macroeconomic data sets.
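As a concrete illustration of the model class studied here, the following sketch simulates a stationary VAR(1) process X_t = A X_{t-1} + ε_t with a (group-)sparse coefficient matrix and recovers A by least squares. The dimensions, sparsity pattern, and estimator are illustrative choices, not the Bayesian procedure or data of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper): p series, T observations.
p, T = 5, 500

# A sparse, stable transition matrix: all eigenvalues have modulus < 1,
# which guarantees a stationary process.
A = np.zeros((p, p))
A[np.diag_indices(p)] = 0.5
A[0, 1] = 0.3  # a single off-diagonal temporal dependency

# Simulate X_t = A X_{t-1} + eps_t with standard normal errors.
X = np.zeros((T, p))
for t in range(1, T):
    X[t] = A @ X[t - 1] + rng.standard_normal(p)

# Least-squares estimate of A: regress X_t on X_{t-1}.
# Rows of Y satisfy y_t = z_t A^T, so lstsq returns A^T.
Y, Z = X[1:], X[:-1]
A_hat = np.linalg.lstsq(Z, Y, rcond=None)[0].T

print(np.round(A_hat, 2))
```

With T well above p, the estimate recovers the sparsity pattern of A up to noise; the high-dimensional regime of the paper is precisely the setting where p grows with T and such unregularized estimates degrade, motivating shrinkage priors.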