Cluster-aware Semi-supervised Learning: Relational Knowledge Distillation Provably Learns Clustering

Cited by: 0
Authors
Dong, Yijun [1 ]
Miller, Kevin [2 ]
Lei, Qi [1 ,3 ]
Ward, Rachel [2 ]
Affiliations
[1] NYU, Courant Inst Math Sci, New York, NY 10003 USA
[2] Univ Texas Austin, Oden Inst Computat Engn & Sci, Austin, TX USA
[3] NYU, Ctr Data Sci, New York, NY USA
Keywords
MATRIX
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Despite the empirical success and practical significance of (relational) knowledge distillation that matches (the relations of) features between teacher and student models, the corresponding theoretical interpretations remain limited for various knowledge distillation paradigms. In this work, we take an initial step toward a theoretical understanding of relational knowledge distillation (RKD), with a focus on semi-supervised classification problems. We start by casting RKD as spectral clustering on a population-induced graph unveiled by a teacher model. Via a notion of clustering error that quantifies the discrepancy between the predicted and ground truth clusterings, we illustrate that RKD over the population provably leads to low clustering error. Moreover, we provide a sample complexity bound for RKD with limited unlabeled samples. For semi-supervised learning, we further demonstrate the label efficiency of RKD through a general framework of cluster-aware semi-supervised learning that assumes low clustering errors. Finally, by unifying data augmentation consistency regularization into this cluster-aware framework, we show that despite the common effect of learning accurate clusterings, RKD facilitates a "global" perspective through spectral clustering, whereas consistency regularization focuses on a "local" perspective via expansion.
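To make the abstract's central construction concrete, the following is a minimal sketch of reading RKD as spectral clustering on a graph induced by teacher features, together with a permutation-based proxy for the clustering error. The RBF affinity, the bandwidth sigma, and the helper names rkd_spectral_clustering and clustering_error are illustrative assumptions for exposition, not the paper's exact definitions or guarantees.

```python
import numpy as np
from itertools import permutations
from sklearn.cluster import KMeans

def rkd_spectral_clustering(teacher_feats, n_clusters, sigma=1.0):
    """Spectral clustering on an affinity graph built from teacher
    features -- a simplified stand-in for the population-induced graph
    described in the abstract (RBF kernel and sigma are assumptions)."""
    # Pairwise squared distances between teacher features, shape (n, n).
    sq = np.sum((teacher_feats[:, None] - teacher_feats[None]) ** 2, axis=-1)
    W = np.exp(-sq / (2.0 * sigma**2))          # RBF affinity matrix
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1))   # entries of D^{-1/2}
    # Symmetric normalized Laplacian: L = I - D^{-1/2} W D^{-1/2}.
    L = np.eye(len(W)) - d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    # Eigenvectors of the k smallest eigenvalues give the spectral embedding.
    _, vecs = np.linalg.eigh(L)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(vecs[:, :n_clusters])

def clustering_error(pred, truth):
    """Fraction of points misclustered under the best one-to-one
    relabeling -- a simple proxy for the paper's notion of clustering
    error (brute-force over permutations, so only feasible for small k)."""
    pred, truth = np.asarray(pred), np.asarray(truth)
    k = truth.max() + 1
    return min(np.mean(np.take(perm, pred) != truth)
               for perm in permutations(range(k)))
```

On synthetic teacher features forming well-separated clusters, this sketch should recover the ground-truth partition almost exactly, i.e. a near-zero clustering error, in line with the abstract's claim that RKD over the population provably leads to low clustering error.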
Pages: 33
Related Papers
50 records in total
  • [41] Clustering Network Traffic Using Semi-Supervised Learning
    Krajewska, Antonina
    Niewiadomska-Szynkiewicz, Ewa
    ELECTRONICS, 2024, 13 (14)
  • [42] Scalable semi-supervised clustering by spectral kernel learning
    Baghshah, M. Soleymani
    Afsari, F.
    Shouraki, S. Bagheri
    Eslami, E.
    PATTERN RECOGNITION LETTERS, 2014, 45: 161-171
  • [43] Local Clustering with Mean Teacher for Semi-supervised learning
    Chen, Zexi
    Dutton, Benjamin
    Ramachandra, Bharathkumar
    Wu, Tianfu
    Vatsavai, Ranga Raju
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021: 6243-6250
  • [44] Structured graph learning for clustering and semi-supervised classification
    Kang, Zhao
    Peng, Chong
    Cheng, Qiang
    Liu, Xinwang
    Peng, Xi
    Xu, Zenglin
    Tian, Ling
    PATTERN RECOGNITION, 2021, 110
  • [45] Iterative double clustering for unsupervised and semi-supervised learning
    El-Yaniv, R
    Souroujon, O
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 14, VOLS 1 AND 2, 2002, 14: 1025-1032
  • [46] Semi-supervised learning made simple with self-supervised clustering
    Fini, Enrico
    Astolfi, Pietro
    Alahari, Karteek
    Alameda-Pineda, Xavier
    Mairal, Julien
    Nabi, Moin
    Ricci, Elisa
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR, 2023: 3187-3197
  • [47] GFD-SSL: generative federated knowledge distillation-based semi-supervised learning
    Karami, Ali
    Ramezani, Reza
    Baraani Dastjerdi, Ahmad
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, 15 (12): 5509-5529
  • [48] Semi-Supervised Learning with Mutual Distillation for Monocular Depth Estimation
    Baek, Jongbeom
    Kim, Gyeongnyeon
    Kim, Seungryong
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2022), 2022: 4562-4569
  • [49] Learning to Cluster with Auxiliary Tasks: A Semi-Supervised Approach
    Figueroa, Jhosimar Arias
    Rivera, Adin Ramirez
    2017 30TH SIBGRAPI CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES (SIBGRAPI), 2017: 141-148
  • [50] Uncertainty-Aware Distillation for Semi-Supervised Few-Shot Class-Incremental Learning
    Cui, Yawen
    Deng, Wanxia
    Chen, Haoyu
    Liu, Li
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (10): 14259-14272