Self-attention in Knowledge Tracing: Why It Works

Cited by: 3
Authors
Pu, Shi [1 ]
Becker, Lee [1 ]
Affiliations
[1] Educ Testing Serv, 660 Rosedale Rd, Princeton, NJ 08540 USA
Source
ARTIFICIAL INTELLIGENCE IN EDUCATION, PT I | 2022, Vol. 13355
Keywords
Deep knowledge tracing; Self-attention; Knowledge tracing;
DOI
10.1007/978-3-031-11644-5_75
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Knowledge tracing refers to the dynamic assessment of a learner's mastery of skills. The self-attention mechanism has been widely adopted in knowledge tracing models in recent years, and these models consistently report performance gains over baseline knowledge tracing models on public datasets. However, why the self-attention mechanism works in knowledge tracing remains unknown. This study argues that the ability to encode when a learner attempts to answer the same item multiple times in a row (henceforth referred to as repeated attempts) is a significant reason why self-attention models perform better than other deep knowledge tracing models. We present two experiments to support our argument, using context-aware knowledge tracing (AKT) as our example self-attention model and dynamic key-value memory networks (DKVMN) and deep performance factors analysis (DPFA) as our baseline models. First, we show that removing repeated attempts from the datasets closes the performance gap between AKT and the baseline models. Second, we present DPFA+, an extension of DPFA that consumes manually crafted repeated-attempts features. We demonstrate that, with these manually crafted repeated-attempts features, DPFA+ outperforms AKT across all datasets.
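The abstract's central claim is that information about repeated attempts (a learner answering the same item several times in a row) is what self-attention models exploit, and that DPFA+ closes the gap by adding such information as explicit features. The sketch below is an illustrative assumption rather than the paper's actual feature engineering: it shows one simple way repeated-attempts features could be derived from a learner's chronological interaction log before being fed to a logistic-regression-style model such as DPFA.

```python
# A minimal sketch (assumption, not the paper's exact feature set): derive
# repeated-attempts features from one learner's chronological interaction log.
from typing import Dict, List, Tuple


def repeated_attempt_features(
    interactions: List[Tuple[int, int]]
) -> List[Dict[str, int]]:
    """interactions: chronological (item_id, correct) pairs for a single learner."""
    features = []
    run_length = 0      # consecutive prior attempts on the current item
    prev_item = None
    prev_correct = 0
    for item_id, correct in interactions:
        # Extend the run if the learner is re-attempting the same item, else reset.
        run_length = run_length + 1 if item_id == prev_item else 0
        features.append({
            "item_id": item_id,
            "repeat_index": run_length,               # 0 means first attempt in this run
            "prev_attempt_correct": prev_correct if run_length > 0 else 0,
        })
        prev_item, prev_correct = item_id, correct
    return features


# Example: the learner answers item 7 three times in a row, failing twice before
# succeeding; the second and third attempts are flagged as repeats.
print(repeated_attempt_features([(3, 1), (7, 0), (7, 0), (7, 1), (5, 1)]))
```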
Pages: 731-736
Number of pages: 6
Related Papers
50 records in total
  • [41] Anisotropy Is Inherent to Self-Attention in Transformers
    Godey, Nathan
    de la Clergerie, Eric
    Sagot, Benoit
    PROCEEDINGS OF THE 18TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 35 - 48
  • [42] Self-attention Hypergraph Pooling Network
    Zhao Y.-F.
    Jin F.-S.
    Li R.-H.
    Qin H.-C.
    Cui P.
    Wang G.-R.
    Ruan Jian Xue Bao/Journal of Software, 2023, 34 (10):
  • [43] Self-Attention Based Video Summarization
    Li Y.
    Wang J.
Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2020, 32 (04): 652 - 659
  • [44] Self-Attention Technology in Image Segmentation
    Cao, Fude
    Lu, Xueyun
    INTERNATIONAL CONFERENCE ON INTELLIGENT TRAFFIC SYSTEMS AND SMART CITY (ITSSC 2021), 2022, 12165
  • [45] Relative molecule self-attention transformer
    Maziarka, Lukasz
    Majchrowski, Dawid
    Danel, Tomasz
    Gainski, Piotr
    Tabor, Jacek
    Podolak, Igor
    Morkisz, Pawel
    Jastrzebski, Stanislaw
    JOURNAL OF CHEMINFORMATICS, 2024, 16 (01)
  • [46] Deformable Self-Attention for Text Classification
    Ma, Qianli
    Yan, Jiangyue
    Lin, Zhenxi
    Yu, Liuhong
    Chen, Zipeng
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2021, 29 : 1570 - 1581
  • [47] The emergence of clusters in self-attention dynamics
    Geshkovski, Borjan
    Letrouit, Cyril
    Polyanskiy, Yury
    Rigollet, Philippe
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [48] Self-Attention with Structural Position Representations
    Wang, Xing
    Tu, Zhaopeng
    Wang, Longyue
    Shi, Shuming
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 1403 - 1409
  • [49] SELF-ATTENTION FOR INCOMPLETE UTTERANCE REWRITING
    Zhang, Yong
    Li, Zhitao
    Wang, Jianzong
    Cheng, Ning
    Xiao, Jing
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 8047 - 8051
  • [50] Overcoming a Theoretical Limitation of Self-Attention
    Chiang, David
    Cholak, Peter
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 7654 - 7664