On Predictive Density Estimation under α-Divergence Loss

Cited: 2
Authors
L'Moudden, A. [1 ]
Marchand, E. [1 ]
Affiliations
[1] Univ Sherbrooke, Dept Math, Sherbrooke, PQ, Canada
Funding
Natural Sciences and Engineering Research Council of Canada;
Keywords
alpha-divergence; dominance; frequentist risk; Hellinger loss; multivariate normal; plug-in; predictive density; restricted parameter space; variance expansion; LOCATION FAMILIES;
DOI
10.3103/S1066530719020030
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject Classification Codes
020208; 070103; 0714;
Abstract
Based on $X \sim N_d(\theta, \sigma_X^2 I_d)$, we study the efficiency of predictive densities under $\alpha$-divergence loss $L_\alpha$ for estimating the density of $Y \sim N_d(\theta, \sigma_Y^2 I_d)$. We identify a large number of cases where improvements on a plug-in density are obtainable by expanding the variance, thus extending earlier findings applicable to Kullback-Leibler loss. The results and proofs are unified with respect to the dimension $d$, the variances $\sigma_X^2$ and $\sigma_Y^2$, and the choice of loss $L_\alpha$, $\alpha \in (-1, 1)$. The findings also apply to a large number of plug-in densities, as well as to restricted parameter spaces with $\theta \in \Theta \subset \mathbb{R}^d$. The theoretical findings are accompanied by various observations, illustrations, and implications dealing, for instance, with robustness with respect to the model variances and simultaneous dominance with respect to the loss.
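As a rough numerical illustration of the setting described in the abstract (not code from the paper), the Python sketch below evaluates one common parametrization of the $\alpha$-divergence loss between two isotropic normal densities in closed form and uses Monte Carlo simulation to compare the frequentist risk of the plug-in density $N_d(X, \sigma_Y^2 I_d)$ with variance-expanded alternatives $N_d(X, c\,\sigma_Y^2 I_d)$, $c > 1$. The function names, the choice $\alpha = 0$ (Hellinger-type loss), and the expansion constants $c$ are illustrative assumptions rather than quantities taken from the paper.

import numpy as np

def alpha_loss(alpha, sq_dist, sigma2_true, sigma2_hat, d):
    # alpha-divergence loss L_alpha between the true density N_d(theta, sigma2_true I_d)
    # and a normal predictive density N_d(mu_hat, sigma2_hat I_d), written as a function
    # of sq_dist = ||theta - mu_hat||^2 via the closed-form Gaussian integral
    # I = \int p_theta^{(1-alpha)/2} qhat^{(1+alpha)/2} dy, with L_alpha = 4/(1-alpha^2) * (1 - I).
    s = (1.0 - alpha) / 2.0                              # exponent on the true density
    mix = s * sigma2_hat + (1.0 - s) * sigma2_true       # variance mixture appearing in I
    log_I = 0.5 * d * ((1.0 - s) * np.log(sigma2_true)
                       + s * np.log(sigma2_hat) - np.log(mix))
    log_I -= s * (1.0 - s) * sq_dist / (2.0 * mix)
    return 4.0 / (1.0 - alpha**2) * (1.0 - np.exp(log_I))

def mc_risk(alpha, d, sigma2_X, sigma2_Y, c, n_rep=200_000, seed=0):
    # Monte Carlo frequentist risk of the predictive density N_d(X, c * sigma2_Y I_d)
    # for Y ~ N_d(theta, sigma2_Y I_d), when X ~ N_d(theta, sigma2_X I_d).
    # In the unrestricted case the risk does not depend on theta, so theta = 0 is used.
    rng = np.random.default_rng(seed)
    X = rng.normal(scale=np.sqrt(sigma2_X), size=(n_rep, d))
    sq = np.sum(X**2, axis=1)                            # ||X - theta||^2 with theta = 0
    return np.mean(alpha_loss(alpha, sq, sigma2_Y, c * sigma2_Y, d))

if __name__ == "__main__":
    d, s2X, s2Y, alpha = 3, 1.0, 1.0, 0.0                # alpha = 0: Hellinger-type loss
    for c in (1.0, 1.5, 1.0 + s2X / s2Y):                # c = 1 is the plug-in density
        print(f"c = {c:4.2f}   estimated risk = {mc_risk(alpha, d, s2X, s2Y, c):.4f}")

Under these illustrative settings the estimated risk for suitable c > 1 falls below that of the plug-in density (c = 1), which is the kind of variance-expansion improvement the paper establishes in much greater generality.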
Pages: 127-143
Number of pages: 17
Related Papers
50 records in total
  • [1] On Predictive Density Estimation under α-Divergence Loss
    A. L’Moudden
    È. Marchand
    Mathematical Methods of Statistics, 2019, 28 : 127 - 143
  • [2] Stochastic domination in predictive density estimation for ordered normal means under α-divergence loss
    Chang, Yuan-Tsung
    Strawderman, William E.
    JOURNAL OF MULTIVARIATE ANALYSIS, 2014, 128 : 1 - 9
  • [3] Predictive density estimation under the Wasserstein loss
    Matsuda, Takeru
    Strawderman, William E.
    JOURNAL OF STATISTICAL PLANNING AND INFERENCE, 2021, 210 : 53 - 63
  • [4] Pitman closeness domination in predictive density estimation for two-ordered normal means under α-divergence loss
    Chang, Yuan-Tsung
    Shinozaki, Nobuo
    Strawderman, William E.
    JAPANESE JOURNAL OF STATISTICS AND DATA SCIENCE, 2020, 3 (01) : 1 - 21
  • [5] β-divergence loss for the kernel density estimation with bias reduced
    Dhaker, Hamza
    Deme, El Hadji
    Ciss, Youssou
    STATISTICAL THEORY AND RELATED FIELDS, 2021, 5 (03) : 221 - 231
  • [6] On predictive density estimation for location families under integrated squared error loss
    Kubokawa, Tatsuya
    Marchand, Eric
    Strawderman, William E.
    JOURNAL OF MULTIVARIATE ANALYSIS, 2015, 142 : 57 - 74
  • [7] On predictive density estimation for location families under integrated absolute error loss
    Kubokawa, Tatsuya
    Marchand, Eric
    Strawderman, William E.
    BERNOULLI, 2017, 23 (4B) : 3197 - 3212
  • [8] Estimation, prediction and the Stein phenomenon under divergence loss
    Ghosh, Malay
    Mergel, Victor
    Datta, Gauri Sankar
    JOURNAL OF MULTIVARIATE ANALYSIS, 2008, 99 (09) : 1941 - 1961
  • [9] On Density Estimation under Relative Entropy Loss Criterion
    M. V. Burnashev
    S. Amari
    Problems of Information Transmission, 2002, 38 (4) : 323 - 346
  • [10] Hierarchical Empirical Bayes Estimation of Two Sample Means Under Divergence Loss
    Ghosh M.
    Kubokawa T.
Sankhya A, 2018, 80 (Suppl 1): 70 - 83