Euclid in a Taxicab: Sparse Blind Deconvolution with Smoothed l1/l2 Regularization

Cited by: 78
Authors
Repetti, Audrey [1]
Mai Quyen Pham [1,2]
Duval, Laurent [2]
Chouzenoux, Emilie [1]
Pesquet, Jean-Christophe [1]
Affiliations
[1] Univ Paris Est, LIGM UMR CNRS 8049, F-77454 Champs Sur Marne, France
[2] IFP Energies Nouvelles, F-92500 Rueil Malmaison, France
Keywords
Blind deconvolution; nonconvex optimization; norm ratio; preconditioned forward-backward algorithm; seismic data processing; sparsity; smoothed l1/l2 regularization; COORDINATE DESCENT METHOD; NONNEGATIVE MATRIX FACTORIZATION; CONVERGENCE; SIGNALS
DOI
10.1109/LSP.2014.2362861
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification
0808; 0809
Abstract
The l1/l2 ratio regularization function has shown good performance for retrieving sparse signals in a number of recent works on blind deconvolution. Indeed, it benefits from a scale-invariance property that is highly desirable in the blind context. However, the l1/l2 function raises some difficulties when solving the nonconvex and nonsmooth minimization problems that result from using such a penalty term in current restoration methods. In this paper, we propose a new penalty based on a smooth approximation to the l1/l2 function. In addition, we develop a proximal-based algorithm to solve variational problems involving this function, and we derive theoretical convergence results. We demonstrate the effectiveness of our method, on an application to seismic data blind deconvolution, through a comparison with a recent alternating optimization strategy that handles the exact l1/l2 term.
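For orientation only, the sketch below illustrates the plain l1/l2 ratio, the scale-invariance property mentioned in the abstract, and one generic way of smoothing the ratio by replacing the absolute value and the Euclidean norm with differentiable surrogates. The function names and the constants alpha and eta are illustrative assumptions; this is not the exact penalty or the algorithm defined in the paper.

```python
import numpy as np

def l1_over_l2(x):
    """Plain (nonsmooth) l1/l2 ratio; undefined at x = 0."""
    return np.sum(np.abs(x)) / np.linalg.norm(x)

def smoothed_l1_over_l2(x, alpha=1e-3, eta=1e-3):
    """Generic smoothed surrogate (illustrative assumption, not the paper's penalty):
    |x_i| is replaced by sqrt(x_i**2 + alpha**2) - alpha and
    ||x||_2 by sqrt(||x||_2**2 + eta**2), making the ratio
    differentiable everywhere, including at x = 0."""
    num = np.sum(np.sqrt(x**2 + alpha**2) - alpha)
    den = np.sqrt(np.sum(x**2) + eta**2)
    return num / den

# Sparse test signal: 100 samples, 5 nonzero entries.
rng = np.random.default_rng(0)
x = np.zeros(100)
x[rng.choice(100, size=5, replace=False)] = rng.standard_normal(5)

# Scale invariance of the exact ratio: multiplying x by any c > 0
# leaves l1/l2 unchanged, which is the property that makes the ratio
# attractive when the gain of the unknown filter is ambiguous.
print(l1_over_l2(x), l1_over_l2(10.0 * x))                    # identical values
print(smoothed_l1_over_l2(x), smoothed_l1_over_l2(10.0 * x))  # close, but not exactly equal
```

The point of such a smoothing is that the resulting penalty is differentiable, so gradient-based proximal schemes of the forward-backward type can be applied, whereas the exact l1/l2 ratio is nonsmooth and nonconvex.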
Pages: 539-543 (5 pages)
Related papers (50 items)
  • [31] L1/2 regularization
    ZongBen Xu
    Hai Zhang
    Yao Wang
    XiangYu Chang
    Yong Liang
    Science China Information Sciences, 2010, 53 : 1159 - 1169
  • [32] L1/2 regularization
    Xu ZongBen
    Zhang Hai
    Wang Yao
    Chang XiangYu
    Liang Yong
    Science China(Information Sciences), 2010, 53 (06) : 1159 - 1169
  • [33] Image Reconstruction in Ultrasonic Transmission Tomography Using L1/L2 Regularization
    Li, Aoyu
    Liang, Guanghui
    Dong, Feng
    2024 IEEE INTERNATIONAL INSTRUMENTATION AND MEASUREMENT TECHNOLOGY CONFERENCE, I2MTC 2024, 2024,
  • [34] Sentiment Analysis of Tweets by Convolution Neural Network with L1 and L2 Regularization
    Rangra, Abhilasha
    Sehgal, Vivek Kumar
    Shukla, Shailendra
    ADVANCED INFORMATICS FOR COMPUTING RESEARCH, ICAICR 2018, PT I, 2019, 955 : 355 - 365
  • [35] ON L1 TRANSFER IN L2 COMPREHENSION AND L2 PRODUCTION
    RINGBOM, H
    LANGUAGE LEARNING, 1992, 42 (01) : 85 - 112
  • [36] Parameter choices for sparse regularization with the l1 norm
    Liu, Qianru
    Wang, Rui
    Xu, Yuesheng
    Yan, Mingsong
    INVERSE PROBLEMS, 2023, 39 (02)
  • [37] L2 x L2 → L1 boundedness criteria
    Grafakos, Loukas
    He, Danqing
    Slavikova, Lenka
    MATHEMATISCHE ANNALEN, 2020, 376 (1-2) : 431 - 455
  • [38] SPARSE REPRESENTATION LEARNING OF DATA BY AUTOENCODERS WITH L1/2 REGULARIZATION
    Li, F.
    Zurada, J. M.
    Wu, W.
    NEURAL NETWORK WORLD, 2018, 28 (02) : 133 - 147
  • [39] Sparse kernel logistic regression based on L1/2 regularization
    Xu Chen
    Peng ZhiMing
    Jing WenFeng
    SCIENCE CHINA-INFORMATION SCIENCES, 2013, 56 (04) : 1 - 16
  • [40] Sparse Hopfield network reconstruction with l1 regularization
    Huang, Haiping
    EUROPEAN PHYSICAL JOURNAL B, 2013, 86 (11):