Knowledge distillation meets recommendation: collaborative distillation for top-N recommendation

Cited by: 2
Authors
Lee, Jae-woong [1 ]
Choi, Minjin [2 ]
Sael, Lee [3 ,4 ]
Shim, Hyunjung [5 ]
Lee, Jongwuk [2 ]
Affiliations
[1] Sungkyunkwan Univ, Seoul, South Korea
[2] Sungkyunkwan Univ, Dept Comp Sci & Engn, Seoul, South Korea
[3] Ajou Univ, Dept Software & Comp Engn, Dept Artificial Intelligence, Seoul, South Korea
[4] Ajou Univ, Dept Convergence Healthcare Med, Seoul, South Korea
[5] Yonsei Univ, Sch Integrated Technol, Seoul, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Knowledge distillation; Top-N recommendation; Collaborative filtering; Data sparsity; Data ambiguity;
DOI
10.1007/s10115-022-01667-8
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge distillation (KD) is a successful method for transferring knowledge from one model (i.e., the teacher model) to another (i.e., the student model). Despite the success of KD in classification tasks, applying KD to recommender models is challenging because of the sparsity of positive feedback, the ambiguity of missing feedback, and the ranking nature of top-N recommendation. In this paper, we propose a new KD model for collaborative filtering, namely collaborative distillation (CD). Specifically, (1) we reformulate the loss function to deal with the ambiguity of missing feedback; (2) we exploit probabilistic rank-aware sampling for top-N recommendation; and (3) to train the proposed model effectively, we develop two training strategies for the student model, called teacher-guided and student-guided training, which adaptively select the most beneficial feedback from the teacher model. Furthermore, we extend our model with self-distillation, called born-again CD (BACD): teacher and student models of the same capacity are trained with the proposed distillation method. The experimental results demonstrate that CD outperforms the state-of-the-art method by 2.7-33.2% and 2.7-29.9% in hit rate (HR) and normalized discounted cumulative gain (NDCG), respectively. Moreover, BACD improves the teacher model by 3.5-12.0% and 4.9-13.3% in HR and NDCG, respectively.
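To make the recipe above concrete, here is a minimal sketch (assuming PyTorch; not the authors' released code) of a CD-style objective: the student fits observed positives with binary cross-entropy, while a few unobserved items per user are drawn with probabilities that decay with the teacher's rank and are supervised by the teacher's soft scores. The function names, the exponential rank weighting, and the loss weight lam are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F

def rank_aware_sample(teacher_scores, unobserved_mask, k, temperature=10.0):
    # Rank of every item under the teacher (0 = top-ranked).
    ranks = teacher_scores.argsort(dim=1, descending=True).argsort(dim=1).float()
    # Higher-ranked unobserved items are sampled with higher probability.
    probs = torch.exp(-ranks / temperature) * unobserved_mask.float()
    probs = probs / probs.sum(dim=1, keepdim=True)
    return torch.multinomial(probs, k)  # (batch, k) sampled item indices

def cd_style_loss(student_logits, teacher_logits, positive_mask, sampled_idx, lam=0.5):
    # Hard loss: observed positive feedback is treated as label 1.
    hard = F.binary_cross_entropy_with_logits(
        student_logits[positive_mask],
        torch.ones_like(student_logits[positive_mask]))
    # Soft loss: match the teacher's probabilities on the sampled unobserved
    # items, which addresses the ambiguity of missing feedback.
    teacher_soft = torch.sigmoid(teacher_logits.gather(1, sampled_idx))
    student_sampled = student_logits.gather(1, sampled_idx)
    soft = F.binary_cross_entropy_with_logits(student_sampled, teacher_soft)
    return hard + lam * soft

Under these assumptions, born-again CD would reuse the same objective with a student whose capacity matches the teacher's, e.g. loss = cd_style_loss(s_logits, t_logits.detach(), pos_mask, rank_aware_sample(t_logits, ~pos_mask, k=10)).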
Pages: 1323 - 1348
Page count: 26
Related Papers
50 records in total
  • [1] Knowledge distillation meets recommendation: collaborative distillation for top-N recommendation
    Jae-woong Lee
    Minjin Choi
    Lee Sael
    Hyunjung Shim
    Jongwuk Lee
    Knowledge and Information Systems, 2022, 64 : 1323 - 1348
  • [2] Collaborative Distillation for Top-N Recommendation
    Lee, Jae-woong
    Choi, Minjin
    Lee, Jongwuk
    Shim, Hyunjung
    2019 19TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2019), 2019: 369 - 378
  • [3] An Improved Top-N Recommendation for Collaborative Filtering
    Yang, Jiongxin
    Wang, Zhenyu
    SOCIAL MEDIA PROCESSING, SMP 2016, 2016, 669 : 233 - 244
  • [4] A novel Enhanced Collaborative Autoencoder with knowledge distillation for top-N recommender systems
    Pan, Yiteng
    He, Fazhi
    Yu, Haiping
    NEUROCOMPUTING, 2019, 332 : 137 - 148
  • [5] Top-N Collaborative Filtering Recommendation Algorithm Based on Knowledge Graph Embedding
    Zhu, Ming
    Zhen, De-sheng
    Tao, Ran
    Shi, You-qun
    Feng, Xiang-yang
    Wang, Qian
    KNOWLEDGE MANAGEMENT IN ORGANIZATIONS, KMO 2019, 2019, 1027 : 122 - 134
  • [6] Top-N Recommendation on Graphs
    Kang, Zhao
    Peng, Chong
    Yang, Ming
    Cheng, Qiang
    CIKM'16: PROCEEDINGS OF THE 2016 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2016: 2101 - 2106
  • [7] An Analysis of Probabilistic Methods for Top-N Recommendation in Collaborative Filtering
    Barbieri, Nicola
    Manco, Giuseppe
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, PT I, 2011, 6911 : 172 - 187
  • [8] CKGAT: Collaborative Knowledge-Aware Graph Attention Network for Top-N Recommendation
    Xu, Zhuoming
    Liu, Hanlin
    Li, Jian
    Zhang, Qianqian
    Tang, Yan
    APPLIED SCIENCES-BASEL, 2022, 12 (03)
  • [9] Boolean kernels for collaborative filtering in top-N item recommendation
    Polato, Mirko
    Aiolli, Fabio
    NEUROCOMPUTING, 2018, 286 : 214 - 225
  • [10] Joint Collaborative Ranking with Social Relationships in Top-N Recommendation
    Rafailidis, Dimitrios
    Crestani, Fabio
    CIKM'16: PROCEEDINGS OF THE 2016 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2016: 1393 - 1402