NaCL: noise-robust cross-domain contrastive learning for unsupervised domain adaptation

Cited by: 0
Authors
Jingzheng Li
Hailong Sun
Affiliations
[1] Beihang University, SKLSDE Lab, School of Computer Science and Engineering
[2] Beihang University, SKLSDE Lab, School of Software
[3] Beijing Advanced Innovation Center for Big Data and Brain Computing
Source
Machine Learning | 2023 / Volume 112
Keywords
Domain adaptation; Contrastive learning; Clustering
Abstract
Unsupervised Domain Adaptation (UDA) methods aim to enhance feature transferability, possibly at the expense of feature discriminability. Recently, contrastive representation learning has emerged as a promising approach to UDA. One line of work combines mainstream domain adaptation methods with contrastive self-supervised tasks; the other uses contrastive learning to align class-conditional distributions according to the semantic structure of the source and target domains. Nevertheless, both lines have limitations. First, the optimal solutions of the contrastive self-supervised objective and of domain discrepancy minimization may not coincide. Second, contrastive learning relies on pseudo labels of the target domain to align class-conditional distributions, and these pseudo labels are noisy, so false positive and false negative pairs deteriorate the performance of contrastive learning. To address these issues, we propose Noise-robust cross-domain Contrastive Learning (NaCL), which tackles the domain adaptation task directly by simultaneously learning instance-wise discrimination and encoding intra- and inter-domain semantic structure into the learned representation space. More specifically, we adopt a topology-based selection on the target domain to detect and remove false positive and false negative pairs from the contrastive loss. Theoretically, we show that NaCL can be interpreted as an instance of Expectation Maximization (EM) and that more accurate pseudo labels reduce the expected error on the target domain. NaCL obtains superior results on three public benchmarks. Furthermore, with only minor modifications, NaCL can also be applied to semi-supervised domain adaptation, achieving strong diagnostic performance on a COVID-19 dataset. Code is available at https://github.com/jingzhengli/NaCL.
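To make the kind of objective described in the abstract more concrete, the following is a minimal PyTorch sketch (not the authors' NaCL implementation) of a class-conditional contrastive loss over pooled source and target features, in which target pseudo labels define positive pairs and low-confidence target samples are dropped before the loss is computed. All names here (cross_domain_contrastive_loss, conf_t, conf_threshold) are assumptions made for illustration, and the simple confidence filter only stands in for the paper's topology-based selection; see the linked repository for the actual method.

import torch


def cross_domain_contrastive_loss(feat_s, labels_s, feat_t, pseudo_t, conf_t,
                                  temperature=0.07, conf_threshold=0.95):
    # feat_s [Ns, d], feat_t [Nt, d]: L2-normalized source/target features.
    # labels_s [Ns]: ground-truth source labels; pseudo_t [Nt]: target pseudo labels.
    # conf_t [Nt]: pseudo-label confidences (hypothetical noise filter, not the
    # paper's topology-based selection).
    feats = torch.cat([feat_s, feat_t], dim=0)
    labels = torch.cat([labels_s, pseudo_t], dim=0)

    # Keep every source sample but only confident target samples, a crude proxy
    # for removing false positive/negative pairs caused by noisy pseudo labels.
    keep = torch.cat([torch.ones_like(labels_s, dtype=torch.bool),
                      conf_t >= conf_threshold], dim=0)
    feats, labels = feats[keep], labels[keep]

    sim = feats @ feats.t() / temperature                      # pairwise similarities
    self_mask = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask

    # Supervised-contrastive (InfoNCE-style) objective: for each anchor, average
    # the log-probability of its same-class pairs over all other samples.
    log_prob = sim - torch.logsumexp(sim.masked_fill(self_mask, float("-inf")),
                                     dim=1, keepdim=True)
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    per_anchor = -(log_prob * pos_mask).sum(dim=1) / pos_count
    return per_anchor[pos_mask.any(dim=1)].mean()              # skip anchors without positives

In a typical UDA pipeline such a term would be computed on L2-normalized projection-head outputs and combined with a standard cross-entropy loss on the labeled source data; the temperature and confidence threshold above are common defaults, not values taken from the paper.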
Pages: 3473 - 3496
Number of pages: 23
Related papers
50 records in total
  • [1] NaCL: noise-robust cross-domain contrastive learning for unsupervised domain adaptation
    Li, Jingzheng
    Sun, Hailong
    MACHINE LEARNING, 2023, 112 (09) : 3473 - 3496
  • [2] Cross-Domain Contrastive Learning for Unsupervised Domain Adaptation
    Wang, Rui
    Wu, Zuxuan
    Weng, Zejia
    Chen, Jingjing
    Qi, Guo-Jun
    Jiang, Yu-Gang
    IEEE TRANSACTIONS ON MULTIMEDIA, 2023, 25 : 1665 - 1673
  • [3] Robust Cross-Domain Pseudo-Labeling and Contrastive Learning for Unsupervised Domain Adaptation NIR-VIS Face Recognition
    Yang, Yiming
    Hu, Weipeng
    Lin, Haiqi
    Hu, Haifeng
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 5231 - 5244
  • [4] Learning cross-domain representations by vision transformer for unsupervised domain adaptation
    Ye, Yifan
    Fu, Shuai
    Chen, Jing
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (15) : 10847 - 10860
  • [5] Domain Confused Contrastive Learning for Unsupervised Domain Adaptation
    Long, Quanyu
    Luo, Tianze
    Wang, Wenya
    Pan, Sinno Jialin
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022 : 2982 - 2995
  • [6] Cross-domain feature enhancement for unsupervised domain adaptation
    Long, Sifan
    Wang, Shengsheng
    Zhao, Xin
    Fu, Zihao
    Wang, Bilin
    APPLIED INTELLIGENCE, 2022, 52 (15) : 17326 - 17340
  • [7] Cross-Domain Error Minimization for Unsupervised Domain Adaptation
    Du, Yuntao
    Chen, Yinghao
    Cui, Fengli
    Zhang, Xiaowen
    Wang, Chongjun
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS (DASFAA 2021), PT II, 2021, 12682 : 429 - 448
  • [8] Unsupervised Domain Adaptation with Imbalanced Cross-Domain Data
    Hsu, Tzu-Ming Harry
    Chen, Wei-Yu
    Hou, Cheng-An
    Tsai, Yao-Hung Hubert
    Yeh, Yi-Ren
    Wang, Yu-Chiang Frank
    2015 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2015, : 4121 - 4129
  • [9] Unsupervised domain adaptation by cross-domain consistency learning for CT body composition
    Ali, Shahzad
    Lee, Yu Rim
    Park, Soo Young
    Tak, Won Young
    Jung, Soon Ki
    MACHINE VISION AND APPLICATIONS, 2025, 36 (01)