Deep Knowledge Tracing Model with an Evolved Transformer Structure

Cited by: 0
Authors
Li, Zhijun [1 ]
Xue, Zixiao [1 ]
Liu, Chen [1 ]
Feng, Yanzhang [1 ]
Affiliations
[1] North China Univ Technol, Sch Elect & Control Engn, Beijing 100144, Peoples R China
Keywords
Deep knowledge tracing; Transformer; Hybrid attention mechanism; Interpretability
DOI
10.1109/DDCLS58216.2023.10167354
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Classification Code
0812
Abstract
Deep-learning-based knowledge tracing (DKT) has become a research hotspot in the field of intelligent education. Compared with conventional methods, DKT achieves better predictive performance, but it suffers from poor interpretability and has difficulty reflecting the causal association between the learning process and test results. In this paper, a new DKT model based on an evolved Transformer structure (DKT-ETS) is proposed. The encoder layer consists of three coding networks with a multi-head self-attention mechanism, whose inputs are three types of pre-processed data: learning-process characteristic data, test-label data, and answer-result data, and whose outputs are the three matrices V, Q, and K. The decoder layer also uses an attention mechanism; its inputs are the three matrices produced by the encoder, and its output is the predicted result. Through this structural improvement, the model lends a degree of interpretability to the V, Q, and K matrices of the attention mechanism, so that the causal relationship between the learning process and test results can be reflected to a certain extent: the V matrix represents the characteristic information of the testee's learning process; the Q matrix reflects the knowledge points examined by the current test item; and the K matrix represents the results of previous tests. DKT-ETS was validated on the large-scale knowledge tracing dataset EdNet, and the results show that its ACC and AUC evaluation metrics are significantly improved.
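The architecture described in the abstract maps onto standard attention operations: three parallel self-attention encoders produce V, Q, and K, and a decoder applies scaled dot-product attention over them. Below is a minimal PyTorch sketch reconstructed from that description; it is not the authors' code, and the layer sizes, the nn.TransformerEncoder stacks, the sigmoid prediction head, and the omission of a causal mask are all assumptions made for illustration.

```python
import math
import torch
import torch.nn as nn

class DKTETSSketch(nn.Module):
    """Illustrative reconstruction of the DKT-ETS structure from the abstract."""
    def __init__(self, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        def encoder():
            layer = nn.TransformerEncoderLayer(
                d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
            return nn.TransformerEncoder(layer, n_layers)
        # Three parallel multi-head self-attention encoders, one per input stream.
        self.enc_process = encoder()   # learning-process features          -> V
        self.enc_items   = encoder()   # test-item / knowledge-point labels -> Q
        self.enc_answers = encoder()   # previous answer results            -> K
        self.out = nn.Linear(d_model, 1)  # assumed prediction head

    def forward(self, process, items, answers):
        # Inputs: (batch, seq_len, d_model), assumed already embedded.
        V = self.enc_process(process)  # what the learner did
        Q = self.enc_items(items)      # what the current items examine
        K = self.enc_answers(answers)  # how earlier attempts turned out
        # Decoder: scaled dot-product attention over (Q, K, V).
        # A causal mask, which a real KT model would need, is omitted for brevity.
        scores = Q @ K.transpose(-2, -1) / math.sqrt(Q.size(-1))
        context = torch.softmax(scores, dim=-1) @ V
        return torch.sigmoid(self.out(context)).squeeze(-1)

# Toy usage with random tensors standing in for embedded EdNet features.
model = DKTETSSketch()
streams = [torch.randn(8, 20, 64) for _ in range(3)]
print(model(*streams).shape)  # torch.Size([8, 20])
```

Splitting the streams this way is what gives each matrix its stated interpretation: V carries process features, Q carries the knowledge points being examined, and K carries prior results.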
Pages: 1586-1592 (7 pages)
Related Papers (50 in total; items [21]-[30] shown)
  • [21] What is wrong with deep knowledge tracing? Attention-based knowledge tracing
    Wang, Xianqing
    Zheng, Zetao
    Zhu, Jia
    Yu, Weihao
    APPLIED INTELLIGENCE, 2023, 53 (03) : 2850 - 2861
  • [22] The Evolved Transformer
    So, David R.
    Liang, Chen
    Le, Quoc V.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [23] Evolutionary Neural Architecture Search for Transformer in Knowledge Tracing
    Yang, Shangshang
    Yu, Xiaoshan
    Tian, Ye
    Yan, Xueming
    Ma, Haiping
    Zhang, Xingyi
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [24] BiDKT: Deep Knowledge Tracing with BERT
    Tan, Weicong
    Jin, Yuan
    Liu, Ming
    Zhang, He
    AD HOC NETWORKS AND TOOLS FOR IT, ADHOCNETS 2021, 2022, 428 : 260 - 278
  • [25] Deep Knowledge Tracing with Learning Curves
    Yang, Shanghui
    Liu, Xin
    Su, Hang
    Zhu, Mengxia
    Lu, Xuesong
    2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW, 2022, : 282 - 291
  • [26] Deep Knowledge Tracing On Programming Exercises
    Wang, Lisa
    Sy, Angela
    Liu, Larry
    Piech, Chris
    PROCEEDINGS OF THE FOURTH (2017) ACM CONFERENCE ON LEARNING @ SCALE (L@S'17), 2017, : 201 - 204
  • [27] Deep knowledge tracing with learning curves
    Su, Hang
    Liu, Xin
    Yang, Shanghui
    Lu, Xuesong
    FRONTIERS IN PSYCHOLOGY, 2023, 14
  • [28] Some Improvements of Deep Knowledge Tracing
    Tato, Ange
    Nkambou, Roger
    2019 IEEE 31ST INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2019), 2019, : 1520 - 1524
  • [29] Deep Knowledge Tracing and Engagement with MOOCs
    Mongkhonvanit, Kritphong
    Kanopka, Klint
    Lang, David
    PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON LEARNING ANALYTICS & KNOWLEDGE (LAK'19), 2019, : 340 - 342
  • [30] ADKT: Adaptive Deep Knowledge Tracing
    He, Liangliang
    Tang, Jintao
    Li, Xiao
    Wang, Ting
    WEB INFORMATION SYSTEMS ENGINEERING, WISE 2020, PT I, 2020, 12342 : 302 - 314