Convolutional Analysis Operator Learning: Acceleration and Convergence

Cited by: 34
Authors
Chun, Il Yong [1 ,2 ]
Fessler, Jeffrey A. [1 ]
Affiliations
[1] Univ Michigan, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
[2] Univ Hawaii Manoa, Dept Elect Engn, Honolulu, HI 96822 USA
Keywords
Convolution; Training; Kernel; Convolutional codes; Computed tomography; Convergence; Image reconstruction; Convolutional regularizer learning; convolutional dictionary learning; convolutional neural networks; unsupervised machine learning algorithms; nonconvex-nonsmooth optimization; block coordinate descent; inverse problems; X-ray computed tomography; COORDINATE DESCENT METHOD; IMAGE-RECONSTRUCTION; SPARSE; OPTIMIZATION; ALGORITHM; DICTIONARIES;
DOI
10.1109/TIP.2019.2937734
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Convolutional operator learning is gaining attention in many signal processing and computer vision applications. Learning kernels has mostly relied on so-called patch-domain approaches that extract and store many overlapping patches across training signals. Due to memory demands, patch-domain methods have limitations when learning kernels from large datasets, particularly with multi-layered structures, e.g., convolutional neural networks, or when applying the learned kernels to high-dimensional signal recovery problems. The so-called convolution approach does not store many overlapping patches, and thus overcomes the memory problems, particularly with careful algorithmic designs; it has been studied within the "synthesis" signal model, e.g., convolutional dictionary learning. This paper proposes a new convolutional analysis operator learning (CAOL) framework that learns an analysis sparsifying regularizer with the convolution perspective, and develops a new convergent Block Proximal Extrapolated Gradient method using a Majorizer (BPEG-M) to solve the corresponding block multi-nonconvex problems. To learn diverse filters within the CAOL framework, this paper introduces an orthogonality constraint that enforces a tight-frame filter condition, and a regularizer that promotes diversity between filters. Numerical experiments show that, with sharp majorizers, BPEG-M significantly accelerates the CAOL convergence rate compared to the state-of-the-art block proximal gradient (BPG) method. Numerical experiments for sparse-view computed tomography show that a convolutional sparsifying regularizer learned via CAOL significantly improves reconstruction quality compared to a conventional edge-preserving regularizer. Using more and wider kernels in a learned regularizer better preserves edges in reconstructed images.
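The abstract names two algorithmic ingredients: an extrapolated proximal gradient step scaled by a majorizer, and an orthogonality (tight-frame) constraint on the filters. The following is a minimal NumPy sketch of how such a filter-block update could look, not the authors' implementation; the function names, the scalar majorizer `M`, and the extrapolation weight `w` are illustrative assumptions. Enforcing a constraint of the form `D @ D.T = scale**2 * I` by projection via the thin SVD (polar decomposition) is a standard technique for orthogonality constraints.

```python
import numpy as np

def project_tight_frame(D, scale=1.0):
    """Project D (K x R, K <= R) onto {D : D @ D.T = scale**2 * I}
    using the polar factor from a thin SVD (illustrative sketch)."""
    U, _, Vt = np.linalg.svd(D, full_matrices=False)
    return scale * (U @ Vt)

def bpegm_filter_step(D, D_prev, grad_f, M=1.0, w=0.5, scale=1.0):
    """One extrapolated proximal gradient step for the filter block
    (hypothetical sketch): extrapolate with momentum, take a gradient
    step scaled by the majorizer M, then enforce the constraint."""
    Z = D + w * (D - D_prev)      # extrapolation using the previous iterate
    G = Z - grad_f(Z) / M         # majorized gradient step
    return project_tight_frame(G, scale)
```

A sharper (e.g., diagonal or structured) majorizer in place of the scalar `M` is what the abstract credits for the observed acceleration over plain block proximal gradient updates; the projection step is what keeps every iterate exactly on the tight-frame constraint set.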
Pages: 2108-2122
Page count: 15
Related Papers
50 records in total
  • [1] Convolutional Dictionary Learning: Acceleration and Convergence
    Chun, Il Yong
    Fessler, Jeffrey A.
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27 (04) : 1697 - 1712
  • [2] Convergence Acceleration Operator for Multiobjective Optimization
    Adra, Salem F.
    Dodd, Tony J.
    Griffin, Ian A.
    Fleming, Peter J.
    IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2009, 13 (04) : 825 - 847
  • [3] Convolutional analysis operator learning for multifocus image fusion
    Zhang, Chengfang
    Feng, Ziliang
    SIGNAL PROCESSING-IMAGE COMMUNICATION, 2022, 103
  • [4] Convolutional Analysis Operator Learning: Dependence on Training Data
    Chun, Il Yong
    Hong, David
    Adcock, Ben
    Fessler, Jeffrey A.
    IEEE SIGNAL PROCESSING LETTERS, 2019, 26 (08) : 1137 - 1141
  • [5] Convergence of reinforcement learning algorithms and acceleration of learning
    Potapov, A
    Ali, MK
    PHYSICAL REVIEW E, 2003, 67 (02):
  • [6] Convolutional analysis operator learning: Application to sparse-view CT
    Chun, Il Yong
    Fessler, Jeffrey A.
    2018 CONFERENCE RECORD OF 52ND ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2018, : 1631 - 1635
  • [7] Convergence Analysis for Anderson Acceleration
    Toth, Alex
    Kelley, C. T.
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 2015, 53 (02) : 805 - 819
  • [8] Operator Semigroups for Convergence Analysis
    Csomos, Petra
    Farago, Istvan
    Fekete, Imre
    Finite Difference Methods, Theory and Applications, 2015, 9045 : 38 - 49
  • [9] Acceleration of the Convergence in Transport Theory by Splitting Operator and Relaxation Method
    Akesbi, S.
    Nicolet, M.
    COMPTES RENDUS DE L ACADEMIE DES SCIENCES SERIE I-MATHEMATIQUE, 1995, 321 (05) : 637 - 640
  • [10] Convolutional Analysis Operator Learning by End-to-End Training of Iterative Neural Networks
    Kofler, Andreas
    Wald, Christian
    Schaeffter, Tobias
    Haltmeier, Markus
    Kolbitsch, Christoph
    2022 IEEE INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (IEEE ISBI 2022), 2022,