Knowledge Distillation via Token-Level Relationship Graph Based on the Big Data Technologies

Cited by: 3
Authors
Zhang, Shuoxi [1 ]
Liu, Hanpeng [1 ]
He, Kun [1 ]
Affiliations
[1] Huazhong Univ Sci & Technol, Sch Comp Sci & Technol, 1037 Luoyu Rd, Wuhan 430074, Hubei, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Knowledge distillation; Graph representation; Graph-based distillation; Big data technology; Neural networks
DOI
10.1016/j.bdr.2024.100438
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In the big data era, characterized by vast volumes of complex data, the efficiency of machine learning models is of utmost importance, particularly in the context of intelligent agriculture. Knowledge distillation (KD), a technique aimed at both model compression and performance enhancement, serves as a pivotal solution by distilling knowledge from an elaborate model (teacher) into a lightweight, compact counterpart (student). However, the full potential of KD has not yet been explored. Existing approaches built on big data technologies primarily focus on transferring instance-level information, overlooking the valuable information embedded in token-level relationships, which is particularly susceptible to long-tail effects. To address these limitations, we propose Knowledge Distillation with a Token-level Relationship Graph (TRG), a method that leverages token-wise relationships to enhance knowledge distillation. By employing TRG, the student model can effectively emulate higher-level semantic information from the teacher model, resulting in improved performance and mobile-friendly efficiency. To further enhance the learning process, we introduce a dynamic temperature adjustment strategy that encourages the student model to capture the topology of the teacher model more effectively. We conduct experiments comparing the proposed method against several state-of-the-art approaches. Empirical results demonstrate the superiority of TRG across various visual tasks, including those involving imbalanced data. Our method consistently outperforms existing baselines, establishing a new state of the art in KD based on big data technologies.
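Only the abstract is available in this record. To make the core idea concrete, the following is a minimal PyTorch-style sketch of a token-level relation-graph distillation loss paired with a dynamic temperature schedule. The cosine-affinity graph construction, the KL matching objective, and the cosine anneal shown here are illustrative assumptions for exposition, not the authors' exact TRG formulation.

# Illustrative sketch of token-level relation-graph distillation (assumed
# formulation, not the paper's released code). Assumes the teacher and
# student both expose per-token features of shape (batch, tokens, dim),
# e.g. from a ViT-style backbone.
import math

import torch
import torch.nn.functional as F


def token_relation_graph(tokens: torch.Tensor, tau: float) -> torch.Tensor:
    """Row-stochastic token-affinity graph from per-token embeddings.

    tokens: (B, N, D) token features; tau: softmax temperature.
    Returns (B, N, N), where row i is a distribution over related tokens.
    """
    tokens = F.normalize(tokens, dim=-1)                   # cosine-style affinities
    affinity = torch.bmm(tokens, tokens.transpose(1, 2))   # (B, N, N)
    return F.softmax(affinity / tau, dim=-1)


def trg_loss(student_tokens, teacher_tokens, tau_student, tau_teacher):
    """KL divergence between teacher and student token-relation graphs."""
    g_s = token_relation_graph(student_tokens, tau_student)
    g_t = token_relation_graph(teacher_tokens, tau_teacher)
    return F.kl_div(g_s.clamp_min(1e-8).log(), g_t, reduction="batchmean")


def dynamic_temperature(step, total_steps, tau_max=4.0, tau_min=1.0):
    """Hypothetical cosine anneal: start soft (large tau), end sharp."""
    progress = min(step / max(total_steps, 1), 1.0)
    return tau_min + 0.5 * (tau_max - tau_min) * (1.0 + math.cos(math.pi * progress))


# Example usage inside a training loop (shapes and weighting illustrative):
#   s_tok, t_tok = student(images), teacher(images).detach()   # (B, N, D) each
#   tau = dynamic_temperature(step, total_steps)
#   loss = task_loss + lambda_trg * trg_loss(s_tok, t_tok, tau, tau)

Under these assumptions, matching row-normalized affinity graphs rather than individual token features lets the student mimic how the teacher relates tokens to one another, while the schedule starts with a soft (high-temperature) graph and sharpens it as training progresses.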
Pages: 12