Underwater image imbalance attenuation compensation based on attention and self-attention mechanism

Cited by: 1
Authors
Wang, Danxu [1 ]
Wei, Yanhui [1 ,2 ]
Liu, Junnan [1 ]
Ouyang, Wenjia [1 ]
Zhou, Xilin [1 ]
Affiliations
[1] Harbin Engn Univ, Coll Intelligent Sci & Engn, Harbin, Peoples R China
[2] Harbin Engn Univ, Nanhai Inst, Sanya, Peoples R China
Keywords
underwater image restoration; attention; self-attention; imbalance attenuation compensation; ENHANCEMENT; QUALITY; VISIBILITY;
DOI
10.1109/OCEANS47191.2022.9977186
Chinese Library Classification (CLC)
U6 [Water Transportation]; P75 [Ocean Engineering];
Discipline classification codes
0814 ; 081505 ; 0824 ; 082401 ;
Abstract
The scattering and absorption of light through water cause underwater images to suffer from low contrast and color distortion. Because attenuation differs with wavelength, the RGB channels carry non-uniform amounts of information. Although many works have addressed underwater image restoration with CNNs, the color distortions caused by imbalanced attenuation have not been handled in previous contributions. In this paper, we demonstrate that employing the green and blue channels to support the red channel in extracting deeper features is helpful for underwater image recovery. Further, we depart from previous CNN-based models by proposing a new model based on attention and a self-attention mechanism, called Underwater Restoration Attention Self-attention (URAS). Our pipeline achieves better performance than other baseline models on the EUVP dataset.
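The channel-support idea described in the abstract can be sketched as a cross-attention step in which red-channel pixels form the queries and the less-attenuated green/blue channels form the keys and values. This is a minimal illustrative sketch only; the function name, random projection weights, and shapes are assumptions, not the authors' actual URAS architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def red_channel_cross_attention(img, d=8, seed=0):
    """Hypothetical sketch: attend from red-channel tokens (queries) to
    green/blue-channel tokens (keys/values) so the better-preserved
    channels compensate the heavily attenuated red channel.
    `img` is an H x W x 3 float array in [0, 1]."""
    h, w, _ = img.shape
    rng = np.random.default_rng(seed)
    red = img[..., 0].reshape(-1, 1)   # queries from the red channel
    gb = img[..., 1:].reshape(-1, 2)   # keys/values from green + blue
    # Random projections stand in for learned weights (assumption).
    wq = rng.standard_normal((1, d))
    wk = rng.standard_normal((2, d))
    wv = rng.standard_normal((2, 1))
    q, k, v = red @ wq, gb @ wk, gb @ wv
    attn = softmax(q @ k.T / np.sqrt(d), axis=-1)  # (HW, HW) attention map
    support = (attn @ v).reshape(h, w)             # red-channel compensation
    out = img.copy()
    out[..., 0] = np.clip(out[..., 0] + support, 0.0, 1.0)
    return out
```

In a trained model the projections would be learned and the attention would likely operate on convolutional feature maps rather than raw pixels; the sketch only shows the data flow by which green/blue information can modulate the red channel.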
Pages: 6