NaCL: noise-robust cross-domain contrastive learning for unsupervised domain adaptation

Cited by: 0
Authors
Jingzheng Li
Hailong Sun
Affiliations
[1] Beihang University, SKLSDE Lab, School of Computer Science and Engineering
[2] Beihang University, SKLSDE Lab, School of Software
[3] Beijing Advanced Innovation Center for Big Data and Brain Computing
Source
Machine Learning | 2023, Vol. 112
Keywords
Domain adaptation; Contrastive learning; Clustering
DOI: not available
Abstract
Unsupervised Domain Adaptation (UDA) methods aim to enhance feature transferability, possibly at the expense of feature discriminability. Recently, contrastive representation learning has been applied to UDA as a promising approach. One line of work combines mainstream domain adaptation methods with contrastive self-supervised tasks; another uses contrastive learning to align class-conditional distributions according to the semantic structure of the source and target domains. Nevertheless, both have limitations. First, the optimal solutions of the contrastive self-supervised task and of domain discrepancy minimization may not coincide. Second, contrastive learning relies on pseudo labels of the target domain to align class-conditional distributions, and because those pseudo labels are noisy, the resulting false positive and false negative pairs can deteriorate contrastive learning. To address these issues, we propose Noise-robust cross-domain Contrastive Learning (NaCL), which directly realizes the domain adaptation task by simultaneously learning instance-wise discrimination and encoding intra- and inter-domain semantic structure into the learned representation space. More specifically, we adopt a topology-based selection on the target domain to detect and remove false positive and false negative pairs from the contrastive loss. Theoretically, we show that NaCL can be viewed as an instance of Expectation Maximization (EM), and that more accurate pseudo label information reduces the expected error on the target domain. NaCL obtains superior results on three public benchmarks. Furthermore, with only minor modifications, NaCL can also be applied to semi-supervised domain adaptation, achieving advanced diagnostic performance on a COVID-19 dataset. Code is available at https://github.com/jingzhengli/NaCL
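The two ingredients the abstract describes — a topology-based selection that discards target samples whose pseudo labels disagree with their neighbourhood, followed by a contrastive loss over the retained pairs — can be sketched as follows. This is an illustrative approximation, not the paper's implementation: the neighbour-agreement rule (`k` nearest neighbours by cosine similarity, agreement threshold `agree`) and the supervised-InfoNCE form of the loss are assumptions chosen to make the idea concrete.

```python
import numpy as np

def filter_pairs(feats, pseudo_labels, k=2, agree=0.5):
    """Illustrative topology-based selection: keep a sample only if at
    least a fraction `agree` of its k nearest neighbours (by cosine
    similarity) share its pseudo label; noisy pseudo labels tend to
    disagree with their neighbourhood and are filtered out."""
    f = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, -np.inf)          # exclude self from neighbours
    keep = np.zeros(len(feats), dtype=bool)
    for i in range(len(feats)):
        nn = np.argsort(sim[i])[-k:]        # k most-similar samples
        keep[i] = np.mean(pseudo_labels[nn] == pseudo_labels[i]) >= agree
    return keep

def contrastive_loss(feats, pseudo_labels, keep, tau=0.1):
    """Supervised InfoNCE over the retained samples only: positives are
    same-pseudo-label pairs among samples that passed the selection, so
    detected false positive/negative pairs never enter the loss."""
    idx = np.where(keep)[0]
    f = feats[idx] / np.linalg.norm(feats[idx], axis=1, keepdims=True)
    sim = f @ f.T / tau
    np.fill_diagonal(sim, -np.inf)          # never contrast a sample with itself
    logp = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos = pseudo_labels[idx][:, None] == pseudo_labels[idx][None, :]
    np.fill_diagonal(pos, False)
    per_anchor = [-logp[i, pos[i]].mean() for i in range(len(idx)) if pos[i].any()]
    return float(np.mean(per_anchor))

# Toy example: two clusters, one point sitting in cluster 1 but
# carrying the (noisy) pseudo label 0 — the selection drops it.
feats = np.array([[1.0, 0.0], [0.9, 0.1], [1.0, 0.1],
                  [0.0, 1.0], [0.1, 0.9], [0.1, 1.0],
                  [0.05, 0.95]])
pseudo = np.array([0, 0, 0, 1, 1, 1, 0])
keep = filter_pairs(feats, pseudo, k=2)      # last sample is rejected
loss = contrastive_loss(feats, pseudo, keep)
```

The filtering step is deliberately conservative: a sample is excluded from both the positive and negative sets rather than relabelled, which matches the abstract's framing of removing false pairs from the contrastive loss.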
Pages: 3473-3496 (23 pages)