Provable Phase Retrieval with Mirror Descent

Cited by: 2
Authors
Godeme, Jean-Jacques [1 ]
Fadili, Jalal [1 ]
Buet, Xavier [2 ]
Zerrad, Myriam [2 ]
Lequime, Michel [2 ]
Amra, Claude [2 ]
Affiliations
[1] Normandie Univ, ENSICAEN, CNRS, GREYC, Caen, France
[2] Aix Marseille Univ, Inst Fresnel, CNRS, Cent Marseille, Marseille, France
Source
SIAM JOURNAL ON IMAGING SCIENCES | 2023, Vol. 16, No. 3
Keywords
phase retrieval; inverse problems; mirror descent; random measurements; LIPSCHITZ GRADIENT CONTINUITY; LOCAL LINEAR CONVERGENCE; 1ST-ORDER METHODS; ALTERNATING PROJECTIONS; ALGORITHMS; CONVEX; RECONSTRUCTION; RECOVERY; PAIRS; MAGNITUDE;
DOI
10.1137/22M1528896
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In this paper, we consider the problem of phase retrieval, which consists of recovering an n-dimensional real vector from the magnitudes of its m linear measurements. We propose a mirror descent (or Bregman gradient descent) algorithm based on a wisely chosen Bregman divergence, which allows us to remove the classical global Lipschitz continuity requirement on the gradient of the nonconvex phase retrieval objective to be minimized. We apply mirror descent to two types of random measurements: i.i.d. standard Gaussian measurements and those obtained by multiple structured illuminations through coded diffraction patterns. In the Gaussian case, we show that when the number of measurements m is large enough, then with high probability, for almost all initializers, the algorithm recovers the original vector up to a global sign change. For both measurement models, mirror descent exhibits local linear convergence with a dimension-independent rate. Finally, our theoretical results are illustrated by various numerical experiments, including an application to the reconstruction of images in precision optics.
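The mirror descent iteration described in the abstract can be sketched numerically. The following is a minimal NumPy illustration, not the paper's implementation: it uses the quartic kernel h(x) = ||x||^4/4 + ||x||^2/2 common in the Bregman phase-retrieval literature, and all function names, problem sizes, and the step-size rule (a relative-smoothness-style bound) are assumptions for the sketch.

```python
import numpy as np

# Sketch of mirror (Bregman) descent for phase retrieval with the
# kernel h(x) = ||x||^4/4 + ||x||^2/2, so grad h(x) = (||x||^2 + 1) x.
# Inverting grad h reduces to the scalar cubic r^3 + r = ||p||,
# which has a closed-form real root (Cardano's formula).

def grad_h_inv(p):
    """Solve (||x||^2 + 1) x = p for x."""
    c = np.linalg.norm(p)
    if c == 0.0:
        return p
    d = np.sqrt(c**2 / 4 + 1 / 27)
    r = np.cbrt(c / 2 + d) + np.cbrt(c / 2 - d)  # real root of r^3 + r = c
    return (r / c) * p

def mirror_descent_pr(A, y, x0, step, iters):
    """Minimize f(x) = (1/4m) sum_i ((a_i^T x)^2 - y_i)^2 by mirror descent:
    grad h(x_{k+1}) = grad h(x_k) - step * grad f(x_k)."""
    m = A.shape[0]
    x = x0.copy()
    for _ in range(iters):
        Ax = A @ x
        grad_f = A.T @ ((Ax**2 - y) * Ax) / m
        grad_h = (x @ x + 1.0) * x
        x = grad_h_inv(grad_h - step * grad_f)
    return x

# Toy Gaussian instance (sizes illustrative, m >> n as the theory requires).
rng = np.random.default_rng(0)
n, m = 5, 100
A = rng.standard_normal((m, n))
x_star = rng.standard_normal(n)
x_star /= np.linalg.norm(x_star)
y = (A @ x_star) ** 2          # magnitude-squared measurements

# Conservative step size from a relative-smoothness-style bound (assumption).
sq = (A**2).sum(axis=1)
L = (3 * (sq**2).sum() + (sq * y).sum()) / m
x = mirror_descent_pr(A, y, 0.1 * rng.standard_normal(n), 1.0 / L, 20000)

# Recovery is only ever up to a global sign change.
err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
print(err)
```

The closed-form inversion of grad h is what makes this kernel practical: each iteration costs two matrix-vector products plus a scalar cubic solve, and no global Lipschitz constant of grad f is needed.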
Pages: 1106-1141
Number of pages: 36