Global Linear and Local Superlinear Convergence of IRLS for Non-Smooth Robust Regression

Cited: 0
Authors
Peng, Liangzu [1 ]
Kummerle, Christian [2 ]
Vidal, Rene [1 ]
Affiliations
[1] Johns Hopkins Univ, Math Inst Data Sci, Baltimore, MD 21218 USA
[2] Univ North Carolina Charlotte, Dept Comp Sci, Charlotte, NC USA
Keywords
REWEIGHTED LEAST-SQUARES; FACE RECOGNITION; SPARSE RECOVERY; PHASE RETRIEVAL; RECONSTRUCTION; DECOMPOSITION; OPTIMIZATION; MINIMIZATION; SIGNALS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
We advance both the theory and practice of robust ℓ_p-quasinorm regression for p ∈ (0, 1] by using novel variants of iteratively reweighted least-squares (IRLS) to solve the underlying non-smooth problem. In the convex case, p = 1, we prove that this IRLS variant converges globally at a linear rate under a mild, deterministic condition on the feature matrix called the stable range space property. In the non-convex case, p ∈ (0, 1), we prove that under a similar condition, IRLS converges locally to the global minimizer at a superlinear rate of order 2 − p; the rate becomes quadratic as p → 0. We showcase the proposed methods in three applications: real phase retrieval, regression without correspondences, and robust face restoration. The results show that (1) IRLS can handle a larger number of outliers than other methods, (2) it is faster than competing methods at the same level of accuracy, and (3) it restores a sparsely corrupted face image with satisfactory visual quality. Code: https://github.com/liangzu/IRLS-NeurIPS2022
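As a rough illustration of the kind of iteration the abstract refers to, the sketch below shows a generic, textbook-style IRLS loop for the ℓ_p regression objective min_x Σ_i |y_i − a_i^T x|^p with p ∈ (0, 1]. It is not the paper's specific variant: the weight-smoothing constant `eps`, its fixed schedule, and the stopping tolerance here are assumptions made only for a self-contained example.

```python
import numpy as np

def irls_lp_regression(A, y, p=1.0, n_iter=100, eps=1e-8):
    """Generic IRLS sketch for min_x sum_i |y_i - a_i^T x|^p, p in (0, 1].

    Illustrative only; the smoothing constant `eps` (kept fixed here) and
    the convergence tolerance are assumptions, not the paper's scheme.
    """
    x = np.linalg.lstsq(A, y, rcond=None)[0]   # least-squares initialization
    for _ in range(n_iter):
        r = y - A @ x                          # current residuals
        # Weights w_i = |r_i|^(p-2), smoothed to avoid division by zero.
        w = np.maximum(np.abs(r), eps) ** (p - 2)
        # Weighted least-squares step: solve (A^T W A) x = A^T W y.
        AW = A * w[:, None]
        x_new = np.linalg.solve(A.T @ AW, AW.T @ y)
        if np.linalg.norm(x_new - x) <= 1e-12 * max(1.0, np.linalg.norm(x)):
            x = x_new
            break
        x = x_new
    return x
```

With p = 1 this reduces to the classical IRLS scheme for least absolute deviations; the abstract's contribution concerns proving global linear (p = 1) and local superlinear, order 2 − p (p < 1), convergence for carefully designed IRLS variants of this type.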
Pages: 16