Wang Chen-Pin, Jo Booil
Department of Epidemiology and Biostatistics, University of Texas Health Science Center, San Antonio TX, 78229, USA.
Department of Psychiatry and Behavioral Sciences, Stanford University, Stanford CA 94305, USA.
Stat Modelling. 2013 Dec;13(5-6):409-429. doi: 10.1177/1471082X13494610.
Wang and Ghosh (2011) proposed a Kullback-Leibler divergence (KLD) that is asymptotically equivalent to the KLD of Goutis and Robert (1998) when the reference model (against which a competing fitted model is compared) is correctly specified and certain regularity conditions hold. While the properties of the Wang and Ghosh (2011) KLD have been investigated in the Bayesian framework, this paper further explores its properties in the frequentist framework using four application examples, each fitted by two competing non-nested models.
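To make the central quantity concrete, the sketch below gives a generic Monte Carlo estimate of a Kullback-Leibler divergence between two competing, non-nested fitted models (a normal fit versus a Laplace fit to hypothetical data). This is an illustrative computation of a plain KLD only; it is not the specific estimator proposed by Wang and Ghosh (2011), and the data-generating model and fits here are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.standard_t(df=5, size=2000)  # hypothetical sample

# Maximum-likelihood fits for the two competing, non-nested models
mu, sigma = data.mean(), data.std()                       # normal
loc = np.median(data)                                     # Laplace location
b = np.abs(data - loc).mean()                             # Laplace scale

def norm_logpdf(x, mu, sigma):
    # log density of N(mu, sigma^2)
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def laplace_logpdf(x, loc, b):
    # log density of Laplace(loc, b)
    return -np.log(2 * b) - np.abs(x - loc) / b

# Monte Carlo estimate of D(normal || Laplace), sampling from the
# fitted normal, which plays the role of the reference model
x = rng.normal(mu, sigma, size=100_000)
kld = np.mean(norm_logpdf(x, mu, sigma) - laplace_logpdf(x, loc, b))
print(f"estimated KLD(normal || Laplace) = {kld:.4f}")
```

Because the divergence is an expectation under the reference model, sampling from that model and averaging log-density differences gives a consistent estimate; a nonnegative value is expected by Gibbs' inequality, up to Monte Carlo error.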