ALORA: Affine Low-Rank Approximations

Cited by: 0
Authors
Alan Ayala
Xavier Claeys
Laura Grigori
Affiliations
[1] INRIA Paris, équipe ALPINES
[2] Laboratoire Jacques-Louis Lions, Sorbonne Université, Univ Paris-Diderot SPC, CNRS
Source
Journal of Scientific Computing, 2019, Vol. 79
Keywords
Low rank; QR factorization; Subspace iteration; Affine subspaces; 65F25; 65F30;
DOI: Not available
Abstract
In this paper we present the concept of affine low-rank approximation for an $m \times n$ matrix, which consists of fitting its columns into an affine subspace of dimension at most $k \ll \min(m,n)$. We present the algorithm ALORA, which constructs an affine approximation by slightly modifying the application of any low-rank approximation method. We focus on approximations created with the classical QRCP and subspace iteration algorithms. For the former, we discuss existing pivoting techniques and provide a bound for the error when an arbitrary pivoting technique is used. For the case of subspace iteration, we prove a result on the convergence of singular vectors, showing a bound that agrees with the one recently proved for the convergence of singular values. Finally, we present numerical experiments using challenging matrices taken from different fields, showing good performance and validating the theoretical framework.
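To illustrate the idea of an affine low-rank approximation described above, the sketch below fits the columns of a matrix into an affine subspace of dimension at most k. It assumes the affine subspace is anchored at the column mean and that the linear part comes from a truncated SVD of the shifted matrix; the paper's ALORA algorithm may choose the shift vector and the underlying low-rank method (e.g. QRCP or subspace iteration) differently, so this is only an illustration of the concept, not the authors' implementation.

```python
# Minimal sketch of an affine rank-k approximation (illustrative assumptions:
# shift = column mean, low-rank step = truncated SVD).
import numpy as np

def affine_low_rank(A, k):
    """Approximate the columns of A by an affine subspace of dimension <= k."""
    c = A.mean(axis=1, keepdims=True)              # shift vector (assumed choice)
    B = A - c                                      # center the columns
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    Bk = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]     # best rank-k fit of the shifted matrix
    return c + Bk                                  # affine approximation of A

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 100)) + 5.0      # matrix with a strong constant offset
    A_aff = affine_low_rank(A, k=10)
    print("relative error:", np.linalg.norm(A - A_aff) / np.linalg.norm(A))
```

Because the constant offset is captured by the shift vector rather than by the leading singular direction, the affine fit can use its k degrees of freedom entirely for the remaining column variation, which is the motivation sketched in the abstract.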
Pages: 1135–1160
Number of pages: 25