FedBiKD: Federated Bidirectional Knowledge Distillation for Distracted Driving Detection

Cited: 11
Authors
Shang, Ertong [1 ]
Liu, Hui [1 ]
Yang, Zhuo [1 ]
Du, Junzhao [1 ]
Ge, Yiming [1 ]
Affiliations
[1] Xidian Univ, Sch Comp Sci & Technol, Xian 710071, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Deep neural networks; distracted driving detection; federated learning (FL); knowledge distillation; driver detection
DOI
10.1109/JIOT.2023.3243622
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Distracted driving is a leading cause of road traffic injuries and deaths. Fortunately, rapidly developing deep learning technology has shown its potential for distracted driving detection. Nevertheless, deep learning-based solutions require collecting large amounts of driving data captured by in-vehicle camera sensors, which raises serious privacy concerns. As a privacy-preserving distributed learning paradigm, federated learning (FL) has recently achieved competitive performance in many applications. Inspired by this, we introduce FL into distracted driving detection. However, we observe that heterogeneous data distributions across drivers lead to significant performance degradation of the model learned in FL. To address this challenge, we propose a simple and effective federated bidirectional knowledge distillation framework, FedBiKD. Specifically, FedBiKD uses knowledge from the global model to guide local training, mitigating local deviation. Meanwhile, the consensus of the ensemble of local models is employed to fine-tune the aggregated global model, reducing volatility in training. Our extensive experiments demonstrate the effectiveness of FedBiKD in distracted driving detection: it significantly outperforms other FL algorithms in terms of accuracy, communication efficiency, convergence rate, and stability.
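To make the two distillation directions in the abstract concrete, the following is a minimal PyTorch sketch of one such training round. All names (distill_loss, local_update, fedavg, server_finetune), the hyperparameter values, and the use of a small server-side proxy dataset for fine-tuning are illustrative assumptions, not the paper's actual implementation.

# Hypothetical sketch of bidirectional distillation in federated learning.
import copy
import torch
import torch.nn.functional as F

def distill_loss(student_logits, teacher_logits, T=2.0):
    # Standard soft-label KL-divergence distillation loss.
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def local_update(global_model, local_loader, epochs=1, alpha=0.5, lr=0.01):
    # Client side: a frozen copy of the global model acts as a teacher that
    # guides local training, mitigating deviation under heterogeneous data.
    model = copy.deepcopy(global_model)
    teacher = copy.deepcopy(global_model).eval()
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in local_loader:
            logits = model(x)
            with torch.no_grad():
                teacher_logits = teacher(x)
            loss = F.cross_entropy(logits, y) + alpha * distill_loss(logits, teacher_logits)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model.state_dict()

def fedavg(state_dicts):
    # Standard FedAvg aggregation: element-wise average of client weights.
    avg = copy.deepcopy(state_dicts[0])
    for k in avg:
        avg[k] = torch.stack([sd[k].float() for sd in state_dicts]).mean(dim=0)
    return avg

def server_finetune(global_model, client_models, proxy_loader, lr=0.001):
    # Server side: the averaged logits of the client ensemble serve as a
    # consensus teacher that fine-tunes the aggregated global model (here
    # on an assumed small proxy dataset; labels are not needed).
    for m in client_models:
        m.eval()
    opt = torch.optim.SGD(global_model.parameters(), lr=lr)
    for x, _ in proxy_loader:
        with torch.no_grad():
            consensus = torch.stack([m(x) for m in client_models]).mean(dim=0)
        loss = distill_loss(global_model(x), consensus)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return global_model

Under these assumptions, one communication round would chain the pieces: clients run local_update against the current global weights, the server applies fedavg to the returned state dicts, loads the result into the global model, and runs server_finetune before broadcasting the next round's weights.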
Pages: 11643-11654 (12 pages)