Federated transfer learning with consensus knowledge distillation for intelligent fault diagnosis under data privacy preserving

Cited by: 1
Authors
Xue, Xingan [1 ]
Zhao, Xiaoping [2 ]
Zhang, Yonghong [1 ]
Ma, Mengyao [2 ]
Bu, Can [3 ]
Peng, Peng [2 ]
Affiliations
[1] Nanjing Univ Informat Sci & Technol, Sch Automat, Nanjing 210044, Peoples R China
[2] Nanjing Univ Informat Sci & Technol, Sch Comp Sci, Nanjing 210044, Peoples R China
[3] Nanjing Normal Univ, Sch Elect & Automat Engn, Nanjing 210023, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
fault diagnosis; federated learning; transfer learning; consensus knowledge distillation; mutual information regularization; rotating machinery
DOI
10.1088/1361-6501/acf77d
Chinese Library Classification
T [Industrial Technology]
Discipline Code
08
Abstract
Fault diagnosis with deep learning has attracted substantial research attention. However, building such a model depends on a large volume of data, and centralizing the data from every device risks privacy leakage. Federated learning lets the devices cooperate to form a global model without violating data privacy. Because the data distribution differs across devices, a global model trained only on the source client with labeled data fails to match the target client, which has no labels. To overcome this issue, this research proposes a federated transfer learning method. Consensus knowledge distillation is adopted to train the extended target-domain model, and mutual information regularization is introduced to further learn the structural information of the target client data. The source client model and the extended target model are aggregated to improve performance. The experimental results demonstrate that the method has broad application prospects.
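The abstract combines two losses: a consensus knowledge distillation term that transfers the agreement of several teacher models to a student, and a mutual information regularizer on the unlabeled target predictions. The sketch below illustrates the general idea only; the averaging-based consensus, the temperature value, and all function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-softened softmax, numerically stabilized by max-subtraction
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def entropy(p, eps=1e-12):
    # Shannon entropy of each row of a probability matrix (nats)
    return -(p * np.log(p + eps)).sum(axis=-1)

def consensus_kd_loss(teacher_logits_list, student_logits, T=2.0):
    # Consensus: average the teachers' softened predictions (one simple choice)
    consensus = np.mean([softmax(t, T) for t in teacher_logits_list], axis=0)
    student = softmax(student_logits, T)
    # KL(consensus || student), averaged over the batch; zero when they agree
    kl = (consensus * (np.log(consensus + 1e-12)
                       - np.log(student + 1e-12))).sum(axis=-1)
    return kl.mean()

def mutual_info_reg(student_logits):
    # I(x; y_hat) ≈ H(mean prediction) - mean per-sample entropy:
    # maximizing it pushes each prediction to be confident while the
    # batch as a whole stays spread over the classes
    p = softmax(student_logits)
    return entropy(p.mean(axis=0)) - entropy(p).mean()
```

In a training loop the target-domain objective would then be something like `consensus_kd_loss(...) - lam * mutual_info_reg(...)` with a weighting coefficient `lam`, minimized with respect to the student parameters.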
Pages: 15
Related papers
50 records in total
  • [21] FedGKD: Federated Graph Knowledge Distillation for privacy-preserving rumor detection
    Zheng, Peng
    Dou, Yong
    Yan, Yeqing
    KNOWLEDGE-BASED SYSTEMS, 2024, 304
  • [22] Privacy-Preserving Heterogeneous Personalized Federated Learning With Knowledge
    Pan, Yanghe
    Su, Zhou
    Ni, Jianbing
    Wang, Yuntao
    Zhou, Jinhao
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (06): : 5969 - 5982
  • [23] Privacy-preserving federated learning with non-transfer learning
    Xu M.
    Li X.
    Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2023, 50 (04): : 89 - 99
  • [24] A Personalized Federated Learning Method Based on Knowledge Distillation and Differential Privacy
    Jiang, Yingrui
    Zhao, Xuejian
    Li, Hao
    Xue, Yu
    ELECTRONICS, 2024, 13 (17)
  • [25] Privacy-preserving gradient boosting tree: Vertical federated learning for collaborative bearing fault diagnosis
    Xia, Liqiao
    Zheng, Pai
    Li, Jinjie
    Tang, Wangchujun
    Zhang, Xiangying
    IET COLLABORATIVE INTELLIGENT MANUFACTURING, 2022, 4 (03) : 208 - 219
  • [26] Selective knowledge sharing for privacy-preserving federated distillation without a good teacher
    Shao, Jiawei
    Wu, Fangzhao
    Zhang, Jun
    NATURE COMMUNICATIONS, 2024, 15 (01)
  • [27] Class-Imbalance Privacy-Preserving Federated Learning for Decentralized Fault Diagnosis With Biometric Authentication
    Lu, Shixiang
    Gao, Zhiwei
    Xu, Qifa
    Jiang, Cuixia
    Zhang, Aihua
    Wang, Xiangxiang
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2022, 18 (12) : 9101 - 9111
  • [29] Intelligent fault diagnosis via ring-based decentralized federated transfer learning
    Wan, Lanjun
    Ning, Jiaen
    Li, Yuanyuan
    Li, Changyun
    Li, Keqin
    KNOWLEDGE-BASED SYSTEMS, 2024, 284
  • [30] Federated contrastive prototype learning: An efficient collaborative fault diagnosis method with data privacy
    Wang, Rui
    Huang, Weiguo
    Zhang, Xiao
    Wang, Jun
    Ding, Chuancang
    Shen, Changqing
    KNOWLEDGE-BASED SYSTEMS, 2023, 281