Exploring the Application of Large-Scale Pre-Trained Models on Adverse Weather Removal

Cited by: 2
Authors
Tan, Zhentao [1 ]
Wu, Yue [1 ]
Liu, Qiankun [2 ,3 ]
Chu, Qi [2 ,3 ]
Lu, Le [1 ]
Ye, Jieping [1 ]
Yu, Nenghai [2 ,3 ]
Affiliations
[1] Alibaba Grp, Hangzhou 310052, Peoples R China
[2] Univ Sci & Technol China USTC, Sch Cyber Sci & Technol, CAS Key Lab Electromagnet Space Informat, Hefei 230026, Peoples R China
[3] Univ Sci & Technol China USTC, Anhui Prov Key Lab Digital Secur, Hefei 230026, Peoples R China
Keywords
Meteorology; Task analysis; Training; Semantics; Image restoration; Rain; Feature extraction; Adverse weather removal; image restoration; multi-modal pre-trained model; NETWORK;
DOI
10.1109/TIP.2024.3368961
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Image restoration under adverse weather conditions (e.g., rain, snow, and haze) is a fundamental computer vision problem with important implications for various downstream applications. Distinct from early methods that are specially designed for specific types of weather, recent works tend to remove various adverse weather effects simultaneously, based on either spatial feature representation learning or semantic information embedding. Inspired by the success of large-scale pre-trained models (e.g., CLIP) in various applications, in this paper we explore their potential benefits for this task from both aspects: 1) for spatial feature representation learning, we design a Spatially Adaptive Residual (SAR) encoder to adaptively extract degraded areas; to facilitate its training, we propose a Soft Residual Distillation (CLIP-SRD) strategy that transfers spatial knowledge from CLIP between clean and adverse weather images; 2) for semantic information embedding, we propose a CLIP Weather Prior (CWP) embedding module that enables the network to adaptively respond to different weather conditions. This module integrates the sample-specific weather priors extracted by the CLIP image encoder with distribution-specific information learned by a set of parameters, and embeds them through a cross-attention mechanism. Extensive experiments demonstrate that the proposed method achieves state-of-the-art performance under various and severe adverse weather conditions. The code will be made available.
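To make the CWP embedding idea in the abstract concrete, the following is a minimal PyTorch sketch of how sample-specific CLIP image priors and distribution-specific learned tokens could be fused into restoration features via cross-attention. The class name, parameter names (clip_dim, feat_dim, num_dist_tokens), and tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a CLIP Weather Prior (CWP) style embedding module.
# All names, shapes, and hyper-parameters below are assumptions for illustration.
import torch
import torch.nn as nn

class CWPEmbeddingSketch(nn.Module):
    def __init__(self, clip_dim=512, feat_dim=256, num_dist_tokens=8, num_heads=4):
        super().__init__()
        # Distribution-specific information: a set of learned parameters (assumed shape).
        self.dist_tokens = nn.Parameter(torch.randn(num_dist_tokens, clip_dim))
        # Project the CLIP prior and the learned tokens into the restoration feature space.
        self.prior_proj = nn.Linear(clip_dim, feat_dim)
        # Cross-attention: restoration features attend to the fused weather prior.
        self.cross_attn = nn.MultiheadAttention(feat_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(feat_dim)

    def forward(self, feats, clip_prior):
        # feats:      (B, N, feat_dim) spatial tokens from the restoration encoder.
        # clip_prior: (B, clip_dim) sample-specific prior from the CLIP image encoder.
        b = clip_prior.size(0)
        # Concatenate the sample-specific CLIP prior with the distribution-specific tokens.
        prior_tokens = torch.cat(
            [clip_prior.unsqueeze(1), self.dist_tokens.unsqueeze(0).expand(b, -1, -1)],
            dim=1,
        )
        prior_tokens = self.prior_proj(prior_tokens)  # (B, 1 + num_dist_tokens, feat_dim)
        # Restoration features query the weather prior via cross-attention, then a
        # residual connection injects the prior back into the feature stream.
        attended, _ = self.cross_attn(query=feats, key=prior_tokens, value=prior_tokens)
        return self.norm(feats + attended)
```

In such a design, the output tokens would simply replace the encoder features fed to the restoration decoder, so the prior injection stays a drop-in residual step rather than a structural change to the backbone.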
Pages: 1683-1698
Page count: 16
Related Papers
50 records in total
  • [31] EBERT: A lightweight expression-enhanced large-scale pre-trained language model for mathematics education
    Duan, Zhiyi
    Gu, Hengnian
    Ke, Yuan
    Zhou, Dongdai
    KNOWLEDGE-BASED SYSTEMS, 2024, 300
  • [32] CURE: A deep learning framework pre-trained on large-scale patient data for treatment effect estimation
    Liu, Ruoqi
    Chen, Pin-Yu
    Zhang, Ping
    PATTERNS, 2024, 5 (06):
  • [33] Refining Pre-Trained Motion Models
    Sun, Xinglong
    Harley, Adam W.
    Guibas, Leonidas J.
    2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2024, 2024, : 4932 - 4938
  • [34] Efficiently Robustify Pre-Trained Models
    Jain, Nishant
    Behl, Harkirat
    Rawat, Yogesh Singh
    Vineet, Vibhav
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION, ICCV, 2023, : 5482 - 5492
  • [35] Pre-trained Models for Sonar Images
    Valdenegro-Toro, Matias
    Preciado-Grijalva, Alan
    Wehbe, Bilal
    OCEANS 2021: SAN DIEGO - PORTO, 2021,
  • [36] Automated Program Repair in the Era of Large Pre-trained Language Models
    Xia, Chunqiu Steven
    Wei, Yuxiang
    Zhang, Lingming
    2023 IEEE/ACM 45TH INTERNATIONAL CONFERENCE ON SOFTWARE ENGINEERING, ICSE, 2023, : 1482 - 1494
  • [37] Pre-Trained Language Models and Their Applications
    Wang, Haifeng
    Li, Jiwei
    Wu, Hua
    Hovy, Eduard
    Sun, Yu
    ENGINEERING, 2023, 25 : 51 - 65
  • [38] Automated LOINC Standardization Using Pre-trained Large Language Models
    Tu, Tao
    Loreaux, Eric
    Chesley, Emma
    Lelkes, Adam D.
    Gamble, Paul
    Bellaiche, Mathias
    Seneviratne, Martin
    Chen, Ming-Jun
    MACHINE LEARNING FOR HEALTH, VOL 193, 2022, 193 : 343 - 355
  • [39] Large pre-trained models for treatment effect estimation: Are we there yet?
    Li, Sheng
    PATTERNS, 2024, 5 (06):
  • [40] SMT Solver Validation Empowered by Large Pre-trained Language Models
    Sun, Maolin
    Yang, Yibiao
    Wang, Yang
    Wen, Ming
    Jia, Haoxiang
    Zhou, Yuming
    2023 38TH IEEE/ACM INTERNATIONAL CONFERENCE ON AUTOMATED SOFTWARE ENGINEERING, ASE, 2023, : 1288 - 1300