Semi-Supervised Domain Adaptation via Asymmetric Joint Distribution Matching

Cited by: 22
Authors
Chen, Sentao [1 ]
Harandi, Mehrtash [2 ,3 ]
Jin, Xiaona [1 ]
Yang, Xiaowei [1 ]
Affiliations
[1] South China Univ Technol, Sch Software Engn, Guangzhou 510006, Peoples R China
[2] Monash Univ, Dept Elect & Comp Syst Engn, Clayton, Vic 3800, Australia
[3] Data61 CSIRO, Canberra, ACT 2601, Australia
Funding
China Postdoctoral Science Foundation; National Natural Science Foundation of China;
Keywords
Manifolds; Optimization; Adaptation models; Predictive models; Data models; Least mean squares methods; Kernel; Feature mapping; joint distribution matching; Riemannian optimization; semi-supervised domain adaptation (SSDA);
DOI
10.1109/TNNLS.2020.3027364
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
An intrinsic problem in domain adaptation is the joint distribution mismatch between the source and target domains. It is therefore crucial to match the two joint distributions so that source domain knowledge can be properly transferred to the target domain. Unfortunately, in semi-supervised domain adaptation (SSDA) this problem remains unsolved. In this article, we therefore present an asymmetric joint distribution matching (AJDM) approach, which seeks a pair of asymmetric matrices to linearly match the source and target joint distributions under the relative chi-square divergence. Specifically, we introduce a least-squares method to estimate the divergence, which is free from estimating the two joint distributions themselves. Furthermore, we show that our AJDM approach generalizes to a kernel version, enabling it to handle nonlinearity in the data. From the perspective of Riemannian geometry, learning the linear and nonlinear mappings is in both cases formulated as an optimization problem defined on a product of Riemannian manifolds. Numerical experiments on synthetic and real-world data sets demonstrate the effectiveness of the proposed approach and confirm its superiority over existing SSDA techniques.
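The abstract's key computational idea, estimating the relative (Pearson) chi-square divergence by least squares without estimating either distribution, can be illustrated with a generic density-ratio sketch. The snippet below is a minimal uLSIF-style example on toy one-dimensional data, not the paper's AJDM method: all names (`ulsif`, `sigma`, `lam`) and the Gaussian-kernel basis are illustrative assumptions, and the matrix learning and Riemannian optimization of AJDM are omitted.

```python
import numpy as np

def ulsif(X_num, X_den, sigma=1.0, lam=0.1):
    """Least-squares density-ratio fit (uLSIF-style sketch).

    Models r(x) ~= p_num(x) / p_den(x) as a linear combination of
    Gaussian kernels centred on the numerator samples, solved in closed
    form -- neither density is estimated individually.
    """
    C = X_num  # kernel centres
    def phi(X):
        d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    H = phi(X_den).T @ phi(X_den) / len(X_den)  # approx E_den[phi phi^T]
    h = phi(X_num).mean(axis=0)                 # approx E_num[phi]
    alpha = np.linalg.solve(H + lam * np.eye(len(C)), h)
    return lambda X: np.maximum(phi(X) @ alpha, 0.0)  # clip negative ratios

rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (200, 1))  # stand-in "source" feature samples
Xt = rng.normal(0.5, 1.0, (200, 1))  # stand-in "target" feature samples
r = ulsif(Xt, Xs)
# Plug-in estimate of the Pearson (relative chi-square) divergence,
# PE = 0.5 * E_den[(r - 1)^2], via the standard least-squares estimator:
pe = r(Xt).mean() - 0.5 * (r(Xs) ** 2).mean() - 0.5
```

In the paper's setting, the inputs to such a divergence estimate would be transformed joint samples rather than raw features, and the transformation matrices would be optimized to drive the divergence down; this sketch only shows the estimation step.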
Pages: 5708 - 5722
Page count: 15
Related papers
50 records total
  • [41] Learning with Augmented Features for Supervised and Semi-Supervised Heterogeneous Domain Adaptation
    Li, Wen
    Duan, Lixin
    Xu, Dong
    Tsang, Ivor W.
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2014, 36 (06) : 1134 - 1148
  • [42] Domain-invariant Graph for Adaptive Semi-supervised Domain Adaptation
    Li, Jinfeng
    Liu, Weifeng
    Zhou, Yicong
    Yu, Jun
    Tao, Dapeng
    Xu, Changsheng
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2022, 18 (03)
  • [43] Cross-Domain Adaptive Clustering for Semi-Supervised Domain Adaptation
    Li, Jichang
    Li, Guanbin
    Shi, Yemin
    Yu, Yizhou
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 2505 - 2514
  • [44] Improved Knowledge Transfer for Semi-supervised Domain Adaptation via Trico Training Strategy
    Ngo, Ba Hung
    Chae, Yeon Jeong
    Kwon, Jung Eun
    Park, Jae Hyeon
    Cho, Sung In
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 19157 - 19166
  • [45] Semi-supervised Domain Adaptation via Sample-to-Sample Self-Distillation
    Yoon, Jeongbeen
    Kang, Dahyun
    Cho, Minsu
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 1686 - 1695
  • [46] Semi-supervised Semantic Matching
    Laskar, Zakaria
    Kannala, Juho
    COMPUTER VISION - ECCV 2018 WORKSHOPS, PT III, 2019, 11131 : 444 - 455
  • [47] HSSDA: Hierarchical relation aided Semi-Supervised Domain Adaptation
    Guo, Xiechao
    Liu, Ruiping
    Song, Dandan
    AI OPEN, 2022, 3 : 156 - 161
  • [48] Information filtering and interpolating for semi-supervised graph domain adaptation
    Qiao, Ziyue
    Xiao, Meng
    Guo, Weiyu
    Luo, Xiao
    Xiong, Hui
    PATTERN RECOGNITION, 2024, 153
  • [50] Semi-Supervised Learning and Domain Adaptation in Natural Language Processing
    Foster, George
    COMPUTATIONAL LINGUISTICS, 2014, 40 (02) : 519 - 522