PerFedRec++: Enhancing Personalized Federated Recommendation with Self-Supervised Pre-Training

Cited by: 1
Authors
Luo, Sichun [1 ,2 ]
Xiao, Yuanzhang [3 ]
Zhang, Xinyi [4 ]
Liu, Yang [5 ]
Ding, Wenbo [6 ,7 ]
Song, Linqi [1 ,2 ]
Affiliations
[1] City Univ Hong Kong, Dept Comp Sci, Hong Kong, Peoples R China
[2] City Univ Hong Kong, Shenzhen Res Inst, Shenzhen, Peoples R China
[3] Univ Hawaii Manoa, Hawaii Adv Wireless Technol Inst, Honolulu, HI USA
[4] Capital Univ Econ & Business, Dept Accounting, Beijing, Peoples R China
[5] Tsinghua Univ, Inst AI Ind Res, Beijing, Peoples R China
[6] Tsinghua Shenzhen Int Grad Sch, Inst Data & Informat, Shenzhen, Peoples R China
[7] Tsinghua Univ, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Federated learning; self-supervised learning; personalization; matrix factorization; neural networks;
DOI
10.1145/3664927
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Federated recommendation systems employ federated learning to safeguard user privacy by exchanging model parameters, rather than raw user data, between user devices and the central server. Nevertheless, current federated recommender systems face three significant challenges: (1) data heterogeneity: the heterogeneity of users' attributes and local data calls for personalized models to improve federated recommendation performance; (2) model performance degradation: privacy-preserving mechanisms in federated recommendation, such as pseudo-item labeling and differential privacy, degrade model performance; (3) communication bottleneck: the standard federated recommendation algorithm incurs high communication overhead. Previous studies have attempted to address these issues, but none has solved all three simultaneously. In this article, we propose a novel framework, named PerFedRec++, to enhance personalized federated recommendation with self-supervised pre-training. Specifically, we exploit the privacy-preserving mechanism of federated recommender systems to generate two augmented graph views, which serve as contrastive tasks in self-supervised graph learning to pre-train the model. Pre-training improves the performance of the federated model by increasing the uniformity of the learned representations; it also provides a better initial state for federated training, so the overall training converges faster, alleviating the heavy communication burden. We then construct a collaborative graph and learn client representations through a federated graph neural network. Based on these learned representations, we cluster users into groups and learn a personalized model for each cluster: each user combines the global federated model, the cluster-level federated model, and its own fine-tuned local model. Experiments on three real-world datasets show that our proposed method achieves superior performance over existing methods.
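To make the mechanics concrete, below is a minimal NumPy sketch of the three ingredients the abstract describes: an InfoNCE-style contrastive loss over two augmented graph views (the self-supervised pre-training objective), a nearest-centroid cluster assignment over learned user representations, and a three-way combination of global, cluster-level, and local parameters into a personalized model. The function names, the equal mixing weights alpha, the temperature value, and the noise-based "augmentation" in the demo are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def info_nce_loss(z1, z2, temperature=0.2):
    # z1, z2: (n, d) embeddings of the same n nodes under two graph
    # augmentations; row i of z1 and row i of z2 form a positive pair,
    # and all other rows in the batch serve as negatives.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = (z1 @ z2.T) / temperature            # cosine similarities
    m = logits.max(axis=1, keepdims=True)         # stabilize the softmax
    log_prob = logits - (m + np.log(np.exp(logits - m).sum(axis=1, keepdims=True)))
    return -np.mean(np.diag(log_prob))            # pull positive pairs together

def assign_clusters(user_reps, centroids):
    # Assign each user to the nearest cluster centroid (Euclidean distance).
    d2 = ((user_reps[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
    return d2.argmin(axis=1)

def personalized_model(global_w, cluster_w, local_w, alpha=(1/3, 1/3, 1/3)):
    # Each user's model is a convex combination of the global federated
    # model, its cluster-level model, and its own fine-tuned local model.
    # Equal weights here are a placeholder, not the paper's setting.
    a_g, a_c, a_l = alpha
    return a_g * global_w + a_c * cluster_w + a_l * local_w

rng = np.random.default_rng(0)
n_users, dim = 8, 16
base = rng.normal(size=(n_users, dim))
view1 = base + 0.1 * rng.normal(size=(n_users, dim))   # stand-in for augmented view 1
view2 = base + 0.1 * rng.normal(size=(n_users, dim))   # stand-in for augmented view 2
print("pre-training loss:", info_nce_loss(view1, view2))

centroids = rng.normal(size=(3, dim))                  # 3 user clusters
print("cluster ids:", assign_clusters(base, centroids))

w = personalized_model(rng.normal(size=dim), rng.normal(size=dim), rng.normal(size=dim))
print("personalized parameters (first 4):", w[:4])

In the actual system, the two views would come from the privacy-preserving perturbations of the user-item graph rather than Gaussian noise, and the cluster centroids would be derived from the federated GNN's client representations rather than sampled at random.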
Pages: 24