Finite-Sample Risk Bounds for Maximum Likelihood Estimation With Arbitrary Penalties

Cited: 1
Authors
Brinda, W. D. [1 ]
Klusowski, Jason M. [1 ]
Institutions
[1] Yale Univ, Dept Stat & Data Sci, New Haven, CT 06511 USA
Keywords
Penalized likelihood estimation; minimum description length; codelength; statistical risk; redundancy; complexity; density estimation
DOI
10.1109/TIT.2017.2789214
CLC Classification
TP [Automation & Computer Technology];
Discipline Code
0812 ;
Abstract
The minimum description length two-part coding index of resolvability provides a finite-sample upper bound on the statistical risk of penalized likelihood estimators over countable models. However, the bound does not apply to unpenalized maximum likelihood estimation or to procedures with exceedingly small penalties. In this paper, we point out a more general inequality that holds for arbitrary penalties. In addition, this approach makes it possible to derive exact risk bounds of order 1/n for i.i.d. parametric models, improving on the order-(log n)/n resolvability bounds. We conclude by discussing implications for adaptive estimation.
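The resolvability bound referenced in the abstract can be sketched as follows. This is a schematic paraphrase of the classical Barron–Cover two-part MDL bound, not an exact statement from the paper; the loss $d$, the penalty condition, and the form of the right-hand side are assumptions for illustration:

```latex
% For a countable model class \Gamma with penalty \mathrm{pen}(\theta)
% satisfying a Kraft-type summability condition, the penalized MLE
%   \hat\theta = \arg\min_{\theta \in \Gamma}
%       \{\, -\log p_\theta(X^n) + \mathrm{pen}(\theta) \,\}
% admits a finite-sample risk bound of the form
\[
  \mathbb{E}\, d\!\left(p^{*}, p_{\hat\theta}\right)
  \;\le\;
  \min_{\theta \in \Gamma}
  \left\{ \frac{D\!\left(p^{*} \,\middle\|\, p_{\theta}\right)
                + \mathrm{pen}(\theta)}{n} \right\},
\]
% where d is a (squared-Hellinger-type) loss and D is Kullback--Leibler
% divergence; the right-hand side is the "index of resolvability."
% The paper's stated contribution is a more general inequality holding
% for arbitrary penalties, including the unpenalized case pen(\theta)=0,
% where the classical bound above does not apply.
```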
Pages: 2727–2741
Page count: 15