Analysis dictionary learning using block coordinate descent framework with proximal operators

Cited by: 7
Authors
Li, Zhenni [1 ]
Ding, Shuxue [1 ]
Hayashi, Takafumi [2 ]
Li, Yujie [3 ]
Affiliations
[1] Univ Aizu, Sch Comp Sci & Engn, Aizu Wakamatsu, Fukushima 9658580, Japan
[2] Niigata Univ, Grad Sch Sci & Technol, Niigata 9502181, Japan
[3] AIST, Ctr Artificial Intelligence, Tsukuba, Ibaraki 3058560, Japan
Keywords
Sparse representation model; Analysis dictionary learning; Block coordinate descent framework; Incoherence; Proximal operator; SPARSE REPRESENTATION; K-SVD; IMAGE; ALGORITHM
DOI
10.1016/j.neucom.2017.02.014
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this study, we propose two analysis dictionary learning algorithms for sparse representation with the analysis model. The problem is formulated with the l1-norm regularizer and two penalty terms on the analysis dictionary Omega: a -log det(Omega^T Omega) term and a coherence penalty term. As the processing scheme, we employ a block coordinate descent framework, so that the overall problem is transformed into a set of univariate subproblems, each minimized with respect to a single vector variable. Each subproblem is still nonsmooth, but it can be solved by a proximal operator, and the closed-form solution can be obtained directly and explicitly. In particular, the coherence penalty, which excludes excessively similar or repeated dictionary atoms, is handled simultaneously with the dictionary update, thereby reducing the complexity. Furthermore, one proposed algorithm introduces a scheme that updates a group of atoms at a time, which lowers the complexity further. According to our analysis and simulation study, the main advantages of the proposed algorithms are their higher dictionary recovery ratios, especially in the low-cosparsity case, and their shorter running times to reach stable values of the dictionary recovery ratio and the recovered cosparsity, compared with state-of-the-art algorithms. In addition, one proposed algorithm performs well in image denoising and in noise cancellation. (C) 2017 Elsevier B.V. All rights reserved.
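The scheme the abstract describes (block coordinate descent over single-vector subproblems, each solved in closed form by a proximal operator) can be sketched in a much simplified form. This is an illustrative sketch only, not the authors' algorithm: the function names are invented, the l1 subproblem is handled by a plain proximal-gradient (soft-thresholding) step, and row normalization stands in as a crude surrogate for the -log det(Omega^T Omega) and coherence penalties.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of the l1-norm: prox_{t*||.||_1}(x),
    # the closed-form shrinkage step mentioned in the abstract.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def bcd_analysis_update(Omega, Y, lam=0.1, n_sweeps=5):
    # Block coordinate descent over the rows (atoms) of the analysis
    # dictionary Omega: each sweep updates one atom at a time via a
    # proximal-gradient step on (1/2)*||w @ Y||^2 + lam*||w @ Y||_1-like
    # sparsity, then renormalizes the atom (a simplification replacing
    # the log-det and coherence penalties of the paper).
    Omega = Omega.copy()
    step = 1.0 / max(np.linalg.norm(Y, 2) ** 2, 1e-12)  # Lipschitz-safe step
    for _ in range(n_sweeps):
        for i in range(Omega.shape[0]):
            w = Omega[i]
            grad = (w @ Y) @ Y.T          # gradient of the smooth part
            w = soft_threshold(w - step * grad, step * lam)  # prox step
            norm = np.linalg.norm(w)
            if norm > 0:
                w = w / norm              # keep atoms on the unit sphere
            Omega[i] = w
    return Omega
```

Updating one atom while the others are fixed is what makes each subproblem univariate in the vector sense, so the nonsmooth term reduces to a shrinkage that has an explicit closed form.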
Pages: 165-180
Page count: 16
Related papers
50 records in total
  • [1] Nonconvex Regularized Robust PCA Using the Proximal Block Coordinate Descent Algorithm
    Wen, Fei
    Ying, Rendong
    Liu, Peilin
    Truong, Trieu-Kien
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2019, 67 (20) : 5402 - 5416
  • [2] Dictionary Learning Based on Nonnegative Matrix Factorization Using Parallel Coordinate Descent
    Tang, Zunyi
    Ding, Shuxue
    Li, Zhenni
    Jiang, Linlin
    ABSTRACT AND APPLIED ANALYSIS, 2013,
  • [3] Blockwise coordinate descent schemes for efficient and effective dictionary learning
    Liu, Bao-Di
    Wang, Yu-Xiong
    Shen, Bin
    Li, Xue
    Zhang, Yu-Jin
    Wang, Yan-Jiang
    NEUROCOMPUTING, 2016, 178 : 25 - 35
  • [4] Global Convergence of Block Coordinate Descent in Deep Learning
    Zeng, Jinshan
    Lau, Tim Tsz-Kit
    Lin, Shao-Bo
    Yao, Yuan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [5] A Block Coordinate Descent Proximal Method for Simultaneous Filtering and Parameter Estimation
    Raziperchikolaei, Ramin
    Bhat, Harish S.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [6] Sparse nonnegative tensor decomposition using proximal algorithm and inexact block coordinate descent scheme
    Wang, Deqing
    Chang, Zheng
    Cong, Fengyu
    NEURAL COMPUTING & APPLICATIONS, 2021, 33 (24) : 17369 - 17387
  • [7] Sparse Representation and Dictionary Learning Based on Alternating Parallel Coordinate Descent
    Tang, Zunyi
    Tamura, Toshiyo
    Ding, Shuxue
    Li, Zhenni
    2013 INTERNATIONAL JOINT CONFERENCE ON AWARENESS SCIENCE AND TECHNOLOGY & UBI-MEDIA COMPUTING (ICAST-UMEDIA), 2013, : 491 - +
  • [8] Scalable Nonparametric Low-Rank Kernel Learning Using Block Coordinate Descent
    Hu, En-Liang
    Kwok, James T.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (09) : 1927 - 1938