LEAST SQUARES APPROXIMATIONS IN LINEAR STATISTICAL INVERSE LEARNING PROBLEMS

Cited by: 0
Authors
Helin, Tapio [1 ]
Affiliations
[1] LUT Univ, Sch Engn Sci, POB 20, FI-53851 Lappeenranta, Finland
Keywords
inverse problems; least squares approximations; statistical learning; minimax; DISCRETIZATION LEVEL CHOICE; INVERSE PROBLEMS; TIKHONOV REGULARIZATION; CONVERGENCE ANALYSIS; SELF-REGULARIZATION; PROJECTION METHODS; RATES; ALGORITHMS; EQUATIONS;
DOI
10.1137/22M1538600
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
Statistical inverse learning aims at recovering an unknown function f from randomly scattered and possibly noisy point evaluations of another function g, connected to f via an ill-posed mathematical model. In this paper we blend statistical inverse learning theory with the classical regularization strategy of applying finite-dimensional projections. Our key finding is that by coupling the number of random point evaluations with the choice of projection dimension, one can derive probabilistic convergence rates for the reconstruction error of the maximum likelihood (ML) estimator. Convergence rates in expectation are derived for an ML estimator complemented with a norm-based cutoff operation. Moreover, we prove that the obtained rates are minimax optimal.
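To make the construction described in the abstract concrete, the following is a minimal numerical sketch of a projected least squares (ML) estimator with a norm-based cutoff. Everything in it is an illustrative assumption rather than the paper's actual setup: the Volterra integration operator as the ill-posed forward model A, the cosine basis used for the finite-dimensional projection, the coupling m ≈ n^(1/3) of projection dimension to sample size, the noise level sigma, and the cutoff radius R.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ground-truth function f on [0, 1].
f_true = lambda t: np.sin(2 * np.pi * t) + 0.5 * t

def apply_A(func, x, quad_pts=2000):
    # Forward operator A chosen for illustration: Volterra integration,
    # (Af)(x) = int_0^x f(t) dt, computed by trapezoidal cumulative quadrature.
    t = np.linspace(0.0, 1.0, quad_pts)
    vals = func(t)
    increments = 0.5 * (vals[1:] + vals[:-1]) * np.diff(t)
    cum = np.concatenate(([0.0], np.cumsum(increments)))
    return np.interp(x, t, cum)

# Random design: n noisy point evaluations of g = Af.
n, sigma = 500, 0.01
x = rng.uniform(0.0, 1.0, size=n)
y = apply_A(f_true, x) + sigma * rng.standard_normal(n)

# Finite-dimensional projection onto the first m basis functions,
# with m coupled to n (the n**(1/3) rule is only an illustrative choice).
m = max(3, int(round(n ** (1.0 / 3.0))))

def basis(j, t):
    # Orthonormal cosine basis on [0, 1].
    return np.ones_like(t) if j == 0 else np.sqrt(2.0) * np.cos(np.pi * j * t)

# Design matrix: column j holds (A phi_j) evaluated at the design points.
Phi = np.column_stack([apply_A(lambda t, j=j: basis(j, t), x) for j in range(m)])

# Maximum likelihood (ordinary least squares) estimate of the coefficients.
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)

# Norm-based cutoff: discard the estimate if its norm exceeds a
# hypothetical tuning radius R.
R = 10.0
if np.linalg.norm(coef) > R:
    coef = np.zeros_like(coef)

# Reconstruct f on a fine grid and report the empirical error.
t_grid = np.linspace(0.0, 1.0, 400)
f_hat = sum(c * basis(j, t_grid) for j, c in enumerate(coef))
print("projection dimension m =", m)
print("RMSE on grid =", np.sqrt(np.mean((f_hat - f_true(t_grid)) ** 2)))

Tying m to n in the sketch mirrors the abstract's coupling of the number of random point evaluations with the projection dimension; the specific exponent and thresholds are illustrative only.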
Pages: 2025-2047
Number of pages: 23