Convolutional Analysis Operator Learning: Acceleration and Convergence

Cited: 34
Authors
Chun, Il Yong [1,2]
Fessler, Jeffrey A. [1 ]
Affiliations
[1] Univ Michigan, Dept Elect Engn & Comp Sci, Ann Arbor, MI 48109 USA
[2] Univ Hawaii Manoa, Dept Elect Engn, Honolulu, HI 96822 USA
Keywords
Convolution; Training; Kernel; Convolutional codes; Computed tomography; Convergence; Image reconstruction; Convolutional regularizer learning; convolutional dictionary learning; convolutional neural networks; unsupervised machine learning algorithms; nonconvex-nonsmooth optimization; block coordinate descent; inverse problems; X-ray computed tomography; COORDINATE DESCENT METHOD; IMAGE-RECONSTRUCTION; SPARSE; OPTIMIZATION; ALGORITHM; DICTIONARIES
DOI
10.1109/TIP.2019.2937734
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Convolutional operator learning is gaining attention in many signal processing and computer vision applications. Learning kernels has mostly relied on so-called patch-domain approaches that extract and store many overlapping patches across training signals. Due to memory demands, patch-domain methods have limitations when learning kernels from large datasets, particularly with multi-layered structures such as convolutional neural networks, or when applying the learned kernels to high-dimensional signal recovery problems. The so-called convolution approach does not store many overlapping patches and thus, with careful algorithmic design, overcomes these memory problems; it has been studied within the "synthesis" signal model, e.g., convolutional dictionary learning. This paper proposes a new convolutional analysis operator learning (CAOL) framework that learns an analysis sparsifying regularizer from the convolution perspective, and develops a new convergent Block Proximal Extrapolated Gradient method using a Majorizer (BPEG-M) to solve the corresponding block multi-nonconvex problems. To learn diverse filters within the CAOL framework, this paper introduces an orthogonality constraint that enforces a tight-frame filter condition, and a regularizer that promotes diversity between filters. Numerical experiments show that, with sharp majorizers, BPEG-M significantly accelerates the CAOL convergence rate compared to the state-of-the-art block proximal gradient (BPG) method. Numerical experiments on sparse-view computed tomography show that a convolutional sparsifying regularizer learned via CAOL significantly improves reconstruction quality compared to a conventional edge-preserving regularizer. Using more and wider kernels in a learned regularizer better preserves edges in reconstructed images.
Pages: 2108-2122 (15 pages)
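
To make the block structure in the abstract concrete, below is a minimal single-image NumPy/SciPy sketch of the two alternating updates: hard-thresholding for the sparse codes (the proximal map of the l0 penalty) and an extrapolated gradient step followed by projection onto the tight-frame constraint set for the filters. This is an illustrative sketch under simplifying assumptions, not the authors' implementation: it uses one toy image, a fixed extrapolation weight, and a scalar Lipschitz majorizer (BPEG-M's acceleration comes from sharper, e.g., diagonal, majorizers and adaptive extrapolation), and all names here are hypothetical.

    import numpy as np
    from numpy.lib.stride_tricks import sliding_window_view
    from scipy.signal import correlate2d

    def hard_threshold(z, alpha):
        # Proximal map of alpha*||z||_0: zero entries with |z| <= sqrt(2*alpha).
        out = z.copy()
        out[np.abs(out) <= np.sqrt(2.0 * alpha)] = 0.0
        return out

    def project_tight_frame(D, R):
        # Nearest R x K matrix with D D^T = (1/R) I: replace the singular
        # values of D with 1/sqrt(R) (Procrustes-type closed form via SVD).
        U, _, Vt = np.linalg.svd(D, full_matrices=False)
        return U @ Vt / np.sqrt(R)

    rng = np.random.default_rng(0)
    x = rng.standard_normal((32, 32))   # toy training image (stand-in for data)
    r, K, alpha = 3, 9, 0.05            # r x r filters, K filters, l0 weight
    R = r * r

    # Scalar majorizer for the filter block: squared spectral norm of the
    # patch matrix (the exact Lipschitz constant of the block gradient).
    P = sliding_window_view(x, (r, r)).reshape(-1, R)
    L = np.linalg.norm(P, 2) ** 2

    D = project_tight_frame(rng.standard_normal((R, K)), R)  # filters as columns
    D_prev, w = D.copy(), 0.9           # extrapolation weight (fixed here)

    for _ in range(50):
        # Sparse-code block: filter the image, then hard-threshold.
        Z = [hard_threshold(correlate2d(x, D[:, k].reshape(r, r), mode='valid'),
                            alpha) for k in range(K)]
        # Filter block: gradient step at the extrapolated point D_tilde,
        # then project back onto the tight-frame constraint set.
        D_tilde = D + w * (D - D_prev)
        G = np.stack([
            correlate2d(x,
                        correlate2d(x, D_tilde[:, k].reshape(r, r), mode='valid')
                        - Z[k], mode='valid').ravel()
            for k in range(K)], axis=1)
        D_prev = D.copy()
        D = project_tight_frame(D_tilde - G / L, R)

Note the design choice the orthogonality constraint enables: the filter update stays cheap because projecting onto the tight-frame set has a closed form through a small R x K SVD, so no inner iterations are needed.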