Training neural networks on artificially generated data: a novel approach to SAR speckle removal

Cited by: 4
Authors
Van Coillie, F. M. B. [1 ]
Lievens, H. [2 ]
Joos, I. [1 ]
Pizurica, A. [3 ]
Verbeke, L. P. C. [4 ]
De Wulf, R. R. [1 ]
Verhoest, N. E. C. [2 ]
Affiliations
[1] Univ Ghent, Lab Forest Management & Spatial Informat Techn, B-9000 Ghent, Belgium
[2] Univ Ghent, Lab Hydrol & Water Management, B-9000 Ghent, Belgium
[3] Univ Ghent, Dept Telecommun & Informat Proc, B-9000 Ghent, Belgium
[4] Geo Solut, B-2550 Kontich, Belgium
Keywords
BAYESIAN WAVELET SHRINKAGE; IMAGES; NOISE; STATISTICS; REDUCTION
DOI
10.1080/01431161003749436
CLC number
TP7 [Remote sensing technology]
Discipline codes
081102 ; 0816 ; 081602 ; 083002 ; 1404
Abstract
A neural network-based method for speckle removal in synthetic aperture radar (SAR) images is introduced. The method rests on the idea that a neural network learning machine, trained on artificially generated input-target couples, can be used to efficiently process real SAR data. A key advantage of the method is that it is trained on artificially generated data, reducing the demands placed on real input data, such as data quality, availability and cost. The artificial data can be generated so as to match the particular characteristics of the images to be denoised, yielding case-specific, high-performing despeckling filters. A comparative study with three classical denoising techniques (Enhanced Frost (EF), Enhanced Lee (EL) and GammaMAP (GM)) and a wavelet filter demonstrated superior speckle removal performance of the proposed method in terms of quantitative performance measures. Moreover, qualitative evaluation of the despeckled results favoured the proposed method, confirming its speckle removal efficiency.
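The core idea of the abstract, namely training a learning machine purely on artificially generated input-target couples, can be illustrated with a minimal sketch. The multiplicative gamma noise below is the standard fully developed speckle model for L-look SAR intensity; the patch size, number of looks, and the linear least-squares mapping (a stand-in for the paper's neural network) are assumptions for illustration, not the authors' actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def speckle(img, looks=4):
    # Multiplicative gamma-distributed speckle with unit mean,
    # the usual model for L-look SAR intensity data.
    noise = rng.gamma(shape=looks, scale=1.0 / looks, size=img.shape)
    return img * noise

def make_pairs(n_patches=2000, size=5, looks=4):
    # Artificial input-target couples: clean piecewise-constant patches
    # (targets) and their synthetically speckled versions (inputs).
    levels = rng.uniform(0.2, 1.0, size=(n_patches, 1, 1))
    clean = levels * np.ones((n_patches, size, size))
    noisy = speckle(clean, looks)
    X = noisy.reshape(n_patches, -1)
    y = clean[:, size // 2, size // 2]  # clean centre pixel
    return X, y

# Fit a linear patch-to-centre-pixel mapping by least squares; in the
# paper this role is played by a trained neural network.
X, y = make_pairs()
Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
w, *_ = np.linalg.lstsq(Xb, y, rcond=None)

# Evaluate on fresh synthetic data: the learned filter should beat the
# raw noisy centre pixel in mean squared error against the clean target.
Xt, yt = make_pairs(500)
pred = np.hstack([Xt, np.ones((Xt.shape[0], 1))]) @ w
mse_filtered = np.mean((pred - yt) ** 2)
mse_noisy = np.mean((Xt[:, Xt.shape[1] // 2] - yt) ** 2)
```

Because the synthetic pairs can be generated to match the statistics of the target imagery (here, the number of looks), the same recipe yields case-specific filters without requiring clean real-world reference data.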
Pages: 3405-3425 (21 pages)