Calibrated Q-Matrix-Enhanced Deep Knowledge Tracing with Relational Attention Mechanism

Cited by: 11
Authors
Li, Linqing [1 ]
Wang, Zhifeng [2 ]
Affiliations
[1] Cent China Normal Univ, Cent China Normal Univ Wollongong Joint Inst, Wuhan 430079, Peoples R China
[2] Cent China Normal Univ, Fac Artificial Intelligence Educ, Wuhan 430079, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 04
Funding
National Natural Science Foundation of China
Keywords
knowledge tracing; attention mechanism; relation modeling; calibrated Q-matrix;
DOI
10.3390/app13042541
Chinese Library Classification
O6 [Chemistry]
Discipline Code
0703
Abstract
With the development of online educational platforms, numerous research works have focused on the knowledge tracing task, which addresses the problem of diagnosing the changing knowledge proficiency of learners. In current knowledge tracing research, deep-neural-network-based models are used to explore the interaction information between students and their answer logs. However, these models ignore the impact of previous interactions, including exercise relations, the forgetting factor, and student behaviors (the slipping factor and the guessing factor). They also do not consider the importance of the Q-matrix, which relates exercises to knowledge points. In this paper, we propose a novel relational attention knowledge tracing (RAKT) model to track students' knowledge proficiency on exercises. Specifically, the RAKT model incorporates students' performance data with corresponding interaction information, such as the context of exercises and the different time intervals between exercises. The RAKT model also takes into account students' interaction behaviors, including the slipping factor and the guessing factor. Moreover, the model considers the relationship between exercise sets and knowledge sets, as well as the relationships among different knowledge points within the same exercise. An extension of RAKT, the calibrated Q-matrix relational attention knowledge tracing model (QRAKT), was developed using a Q-matrix calibration method based on hierarchical knowledge levels. Experiments were conducted on two public educational datasets, ASSISTment2012 and Eedi. The experimental results indicated that the RAKT model and the QRAKT model outperformed the four baseline models.
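The Q-matrix and the slipping and guessing factors mentioned in the abstract can be illustrated with a minimal DINA-style response model (a toy sketch for illustration only, not the authors' RAKT/QRAKT implementation; the matrix entries, parameter values, and function names below are all hypothetical):

```python
import numpy as np

# A binary Q-matrix maps exercises (rows) to the knowledge points
# (columns) they require. Here: 4 exercises over 3 knowledge points.
Q = np.array([
    [1, 0, 0],   # exercise 0 requires knowledge point 0
    [1, 1, 0],   # exercise 1 requires knowledge points 0 and 1
    [0, 1, 1],   # exercise 2 requires knowledge points 1 and 2
    [0, 0, 1],   # exercise 3 requires knowledge point 2
])

def answer_probability(mastered, exercise, slip=0.1, guess=0.2):
    """DINA-style correct-response probability with slipping and
    guessing: a student who has mastered every required knowledge
    point still answers wrongly ("slips") with probability `slip`;
    a student lacking some required point may still answer correctly
    by guessing, with probability `guess`."""
    required = Q[exercise].astype(bool)
    has_all = bool(np.all(mastered[required]))
    return (1.0 - slip) if has_all else guess

# A student who has mastered knowledge points 0 and 1 but not 2:
mastered = np.array([True, True, False])
p1 = answer_probability(mastered, 1)  # all prerequisites met -> 1 - slip
p2 = answer_probability(mastered, 2)  # point 2 missing       -> guess
```

Deep knowledge tracing models such as RAKT replace the binary mastery vector with learned latent states and attention over past interactions, but the role of the Q-matrix and of the slip/guess behaviors is analogous to this classical formulation.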
Pages: 24