A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation

Cited by: 0
Authors
Venkataramanan, Ramji [1 ]
Johnson, Oliver [2 ]
Affiliations
[1] Univ Cambridge, Dept Engn, Trumpington St, Cambridge CB3 0DZ, England
[2] Univ Bristol, Sch Math, Bristol BS8 1TW, Avon, England
Source
ELECTRONIC JOURNAL OF STATISTICS | 2018, Vol. 12, No. 1
Keywords
Minimax lower bounds; Fano's inequality; compressed sensing; density estimation; active learning; MINIMAX RATES; LASSO
DOI
10.1214/18-EJS1419
Chinese Library Classification
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics]
Discipline Codes
020208; 070103; 0714
Abstract
In statistical inference problems, we wish to obtain lower bounds on the minimax risk, that is, to bound the performance of any possible estimator. A standard technique for this uses Fano's inequality. However, recent work in an information-theoretic setting has shown that an argument based on binary hypothesis testing gives tighter converse results (error lower bounds) than Fano's inequality for channel coding problems. We adapt this technique to the statistical setting and argue that Fano's inequality can always be replaced by this approach, yielding tighter lower bounds that are easily computed and asymptotically sharp. We illustrate the technique in three applications: density estimation, active learning of a binary classifier, and compressed sensing, obtaining tighter risk lower bounds in each case.
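For context, the standard Fano bound that the abstract contrasts against is simple to evaluate: for a test among M hypotheses, any estimator's error probability satisfies P_err ≥ 1 − (I + log 2)/log M, where I is the mutual information between the parameter and the data. A minimal sketch (function name and numbers are illustrative, not taken from the paper):

```python
import math

def fano_lower_bound(mutual_info_nats: float, num_hypotheses: int) -> float:
    """Classical Fano lower bound on the error probability of any test
    among `num_hypotheses` hypotheses: P_err >= 1 - (I + log 2) / log M,
    with mutual information I measured in nats."""
    bound = 1.0 - (mutual_info_nats + math.log(2)) / math.log(num_hypotheses)
    return max(0.0, bound)  # the bound is vacuous (0) when I is large

# Example: M = 1024 hypotheses, I = 2 nats of mutual information.
# Any estimator then errs with probability at least about 0.61.
print(fano_lower_bound(2.0, 1024))
```

The paper's point is that a converse built from binary hypothesis tests can improve on this value while remaining just as computable.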
Pages: 1126-1149
Page count: 24