Self-Activated Implicit Neural Representation for Synthetic Aperture Radar Images

Cited by: 0
Authors
Han, Dongshen [1 ]
Zhang, Chaoning [1 ]
Affiliations
[1] Kyung Hee Univ, Sch Comp, Yongin 17104, South Korea
Keywords
implicit neural representation; synthetic aperture radar; self-activation; speckle reduction
DOI
10.3390/rs16234473
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science]
Subject Classification Codes
08; 0830
Abstract
Image Implicit Neural Representations (INRs) adopt a neural network to learn a continuous function mapping pixel coordinates to their corresponding values, a task that has gained significant attention for representing images in a continuous manner. Despite substantial progress on natural images, INRs for Synthetic Aperture Radar (SAR) images have received little investigation. This work makes a pioneering effort to study INRs for SAR images and finds that fine details are hard to represent. Prior works have shown that fine details are easier to learn when the model weights are better initialized, which motivated us to investigate the benefits of activating the model weights before target training. The challenge of this task lies in the fact that SAR images cannot be used during the model activation stage. To this end, we propose exploiting a cross-pixel relationship of the model output, which relies on no target images. Specifically, we design a novel self-activation method that alternates between two loss functions: one used to smooth out the model output, and another used for the opposite purpose. Extensive results on SAR images empirically show that our proposed method improves model performance by a non-trivial margin.
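The abstract does not specify the exact form of the two alternating losses, so the following is a hypothetical sketch only: it assumes the cross-pixel relationship is measured by a total-variation-style term on the model's output grid, with the smoothing loss and its opposite selected on alternating activation steps. The function names (`tv_smoothness_loss`, `self_activation_loss`) are illustrative, not from the paper.

```python
import numpy as np

def tv_smoothness_loss(output_grid):
    # Total-variation-style cross-pixel term: penalizes squared
    # differences between neighboring predicted values, so minimizing
    # it smooths the model output. Needs no target (SAR) image.
    dy = np.diff(output_grid, axis=0)  # vertical neighbor differences
    dx = np.diff(output_grid, axis=1)  # horizontal neighbor differences
    return float(np.mean(dy ** 2) + np.mean(dx ** 2))

def self_activation_loss(output_grid, step):
    # Alternate between the smoothing objective (even steps) and its
    # opposite, which roughens the output (odd steps), as one possible
    # reading of the alternating-loss self-activation scheme.
    sign = 1.0 if step % 2 == 0 else -1.0
    return sign * tv_smoothness_loss(output_grid)

# Example: a constant grid is perfectly smooth; a noisy grid is not.
flat = np.zeros((8, 8))
noisy = np.random.default_rng(0).standard_normal((8, 8))
print(tv_smoothness_loss(flat))              # 0.0
print(self_activation_loss(noisy, 0) > 0.0)  # smoothing step
print(self_activation_loss(noisy, 1) < 0.0)  # opposite step
```

In an actual activation phase, this scalar would be backpropagated through the INR network before training on the target SAR image begins; here only the loss alternation itself is illustrated.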
Pages: 20