A Pre-trained Knowledge Tracing Model with Limited Data

Times Cited: 0
Authors
Yue, Wenli [1 ,3 ]
Su, Wei [1 ,3 ]
Liu, Lei [2 ]
Cai, Chuan [1 ]
Yuan, Yongna [1 ]
Jia, Zhongfeng [1 ]
Liu, Jiamin [1 ]
Xie, Wenjian [1 ]
Affiliations
[1] Lanzhou Univ, Sch Informat Sci & Engn, Lanzhou, Peoples R China
[2] Duzhe Publishing Grp Co Ltd, Lanzhou, Peoples R China
[3] Key Lab Media Convergence Technol & Commun, Lanzhou, Gansu, Peoples R China
Keywords
Knowledge Tracing; Limited Data; Pre-training; Fine-tuning
DOI
10.1007/978-3-031-68309-1_14
CLC Number
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
Online education systems have gained increasing popularity in part because they fully preserve users' learning data. This enables researchers to assess learners' mastery from their learning trajectories, thereby facilitating personalized education and support. Knowledge tracing, an effective educational aid, models students' implicit knowledge states and predicts their mastery of knowledge from their historical answer records. For newly launched online learning platforms, however, the lack of sufficient historical answer data can prevent accurate prediction of students' knowledge states, rendering existing knowledge tracing models less effective. This paper introduces the first pre-trained knowledge tracing model, which leverages a substantial amount of existing data for pre-training and a smaller dataset for fine-tuning. Validated on several publicly available knowledge tracing datasets, our method significantly improves tracing performance on small datasets, with a maximum AUC increase of 5.07%. Beyond the small-data setting, pre-training on the entire dataset also yields a higher AUC than the baseline, marking a novel direction in knowledge tracing research. Furthermore, the paper analyzes the outcomes of pre-training experiments that use varying numbers of interactions as the fine-tuning dataset, providing valuable insights for Intelligent Tutoring Systems (ITS).
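The abstract describes a pre-train-then-fine-tune workflow but the record carries no implementation details, so the following is a minimal sketch under stated assumptions: a DKT-style LSTM stands in for the authors' (unspecified) architecture, the interaction data is synthetic, and all names and hyperparameters are illustrative rather than taken from the paper.

```python
import torch
import torch.nn as nn

class DKT(nn.Module):
    """Minimal DKT-style tracer: one-hot (skill, correctness) interactions
    -> LSTM -> per-skill mastery probabilities. A stand-in architecture,
    not the model from the paper."""
    def __init__(self, n_skills, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(2 * n_skills, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_skills)

    def forward(self, x):
        h, _ = self.lstm(x)                 # (batch, seq, hidden)
        return torch.sigmoid(self.out(h))   # (batch, seq, n_skills)

def encode(skills, correct, n_skills):
    """One-hot encode interactions at index skill + correct * n_skills."""
    x = torch.zeros(len(skills), 2 * n_skills)
    x[torch.arange(len(skills)), skills + correct * n_skills] = 1.0
    return x

def train(model, skills, correct, n_skills, epochs, lr):
    x = encode(skills, correct, n_skills)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    bce = nn.BCELoss()
    for _ in range(epochs):
        pred = model(x.unsqueeze(0)).squeeze(0)  # (seq, n_skills)
        # Predict each interaction from the state after the previous one.
        p = pred[:-1].gather(1, skills[1:].unsqueeze(1)).squeeze(1)
        loss = bce(p, correct[1:].float())
        opt.zero_grad(); loss.backward(); opt.step()

n_skills = 50  # assumes source and target platforms share one skill vocabulary
model = DKT(n_skills)

# 1) Pre-train on a large existing dataset (synthetic stand-in).
sk, ok = torch.randint(0, n_skills, (2000,)), torch.randint(0, 2, (2000,))
train(model, sk, ok, n_skills, epochs=5, lr=1e-3)

# 2) Fine-tune on the small target dataset, with a lower learning rate.
sk, ok = torch.randint(0, n_skills, (100,)), torch.randint(0, 2, (100,))
train(model, sk, ok, n_skills, epochs=3, lr=1e-4)
```

In practice a source and target platform rarely share a question or skill vocabulary, so a real transfer setup would need a shared input space or an embedding mapping between the two; the lower fine-tuning learning rate is simply the usual choice for adapting pre-trained weights to limited data.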
Pages: 163-178
Number of Pages: 16
Related Papers
50 records in total
  • [21] A novel fusion framework for sequential data using pre-trained model
    Ruan, Tao
    Jin, Canghong
    Xu, Lei
    Ding, Jianchao
    Ying, Shengyu
    Wu, Minghui
    Li, Huanqiang
    IAENG International Journal of Computer Science, 2020, 47(3): 593-598
  • [22] Model Based Reinforcement Learning Pre-Trained with Various State Data
    Ono, Masaaki
    Ichise, Ryutaro
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024: 918-925
  • [23] Billion-scale pre-trained knowledge graph model for conversational chatbot
    Wong, Chi-Man
    Feng, Fan
    Zhang, Wen
    Chen, Huajun
    Vong, Chi-Man
    Chen, Chuangquan
    NEUROCOMPUTING, 2024, 606
  • [24] Explainable reasoning over temporal knowledge graphs by pre-trained language model
    Li, Qing
    Wu, Guanzhong
    INFORMATION PROCESSING & MANAGEMENT, 2025, 62(1)
  • [25] A Pre-trained Universal Knowledge Graph Reasoning Model Based on Rule Prompts
    Cui, Yuanning
    Sun, Zequn
    Hu, Wei
    Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2024, 61(8): 2030-2044
  • [26] Prompting disentangled embeddings for knowledge graph completion with pre-trained language model
    Geng, Yuxia
    Chen, Jiaoyan
    Zeng, Yuhang
    Chen, Zhuo
    Zhang, Wen
    Pan, Jeff Z.
    Wang, Yuxiang
    Xu, Xiaoliang
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 268
  • [27] NMT Enhancement based on Knowledge Graph Mining with Pre-trained Language Model
    Yang, Hao
    Qin, Ying
    Deng, Yao
    Wang, Minghan
    2020 22ND INTERNATIONAL CONFERENCE ON ADVANCED COMMUNICATION TECHNOLOGY (ICACT): DIGITAL SECURITY GLOBAL AGENDA FOR SAFE SOCIETY!, 2020: 185-189
  • [28] Vietnamese Sentence Paraphrase Identification using Pre-trained Model and Linguistic Knowledge
    Dien Dinh
    Nguyen Le Thanh
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2021, 12(8): 796-806
  • [29] CaseEncoder: A Knowledge-enhanced Pre-trained Model for Legal Case Encoding
    Ma, Yixiao
    Wu, Yueyue
    Su, Weihang
    Ai, Qingyao
    Liu, Yiqun
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING, EMNLP 2023, 2023: 7134-7143
  • [30] Probing Pre-Trained Language Models for Disease Knowledge
    Alghanmi, Israa
    Espinosa-Anke, Luis
    Schockaert, Steven
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021: 3023-3033