Group-Level Cognitive Diagnosis: A Multi-Task Learning Perspective

Cited by: 8
Authors
Huang, Jie [1 ]
Liu, Qi [1 ]
Wang, Fei [1 ]
Huang, Zhenya [1 ]
Fang, Songtao [1 ]
Wu, Runze [2 ]
Chen, Enhong [1 ]
Su, Yu [1 ,3 ]
Wang, Shijin [3 ]
Affiliations
[1] Univ Sci & Technol China, Anhui Prov Key Lab Big Data Anal & Applicat Sch C, Hefei, Anhui, Peoples R China
[2] NetEase Inc, Fuxi AI Lab, Hangzhou, Peoples R China
[3] IFLYTEK Res, Hefei, Anhui, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Group-Level Cognitive Diagnosis; Multi-Task Learning; Attention Mechanism; Data Sparsity; DINA Model;
DOI
10.1109/ICDM51629.2021.00031
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Most cognitive diagnosis research in education has concentrated on individual assessment, aiming to discover the latent characteristics of students. In many real-world scenarios, however, group-level assessment is an important and meaningful task; for example, assessing classes across regions can reveal differences in teaching quality between contexts. In this work, we consider assessing the cognitive ability of a group of students, i.e., mining a group's proficiency on specific knowledge concepts. The key challenge in this task is the sparsity of group-exercise response data, which seriously degrades assessment performance. Existing works either fail to make effective use of additional student-exercise response data or fail to reasonably model the relationship between group ability and individual ability in different learning contexts, yielding sub-optimal diagnosis results. To this end, we propose a general Multi-Task based Group-Level Cognitive Diagnosis (MGCD) framework featuring three special designs: 1) we jointly model student-exercise and group-exercise responses in a multi-task manner to alleviate the sparsity of group-exercise responses; 2) we design a context-aware attention network to model the relationship between student knowledge states and group knowledge states in different contexts; 3) we model an interpretable cognitive layer to obtain student ability, group ability, and exercise factors (e.g., difficulty), and leverage neural networks to learn the complex interaction functions among them. Extensive experiments on real-world datasets demonstrate the generality of MGCD and the effectiveness of our attention design and multi-task learning.
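The abstract's designs 2) and 3) can be pictured concretely: a group's knowledge state as a context-conditioned attention pooling over its members' states, fed into an IRT-style interaction with exercise difficulty. The paper's actual equations are not reproduced in this record, so the sketch below is a minimal illustrative assumption, not MGCD itself; all function names, the dot-product attention form, and the sigmoid interaction are hypothetical choices.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def group_knowledge_state(student_states, context_vec):
    """Aggregate member states (n_students x n_concepts) into one group
    state, with attention weights conditioned on a learning-context vector."""
    scores = student_states @ context_vec        # one score per student
    weights = softmax(scores)                    # weights sum to 1
    return weights @ student_states              # (n_concepts,) group state

def predict_response(group_state, concept_mask, difficulty):
    """IRT-style interaction: probability of a correct group response rises
    with mastery of the exercise's concepts and falls with difficulty."""
    mastery = (group_state * concept_mask).sum() / concept_mask.sum()
    return 1.0 / (1.0 + np.exp(-(mastery - difficulty)))

# Toy usage: 3 students, 4 knowledge concepts, hypothetical context embedding.
states = np.array([[0.9, 0.2, 0.5, 0.7],
                   [0.4, 0.8, 0.6, 0.1],
                   [0.7, 0.5, 0.3, 0.9]])
context = np.array([1.0, 0.0, 0.5, 0.5])
g = group_knowledge_state(states, context)
p = predict_response(g, np.array([1, 0, 1, 0]), difficulty=0.3)
```

In the actual framework, the attention network and the interaction function are learned jointly with the student-exercise prediction task, which is what lets sparse group-exercise data borrow strength from individual responses.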
Pages: 210 - 219
Page count: 10
Related Papers
50 records (10 shown)
  • [1] Joint Prediction of Group-Level Emotion and Cohesiveness with Multi-Task Loss
    Zou, Bochao
    Lin, Zhifeng
    Wang, Haoyi
    Wang, Yingxue
    Lyu, Xiangwen
    Xie, Haiyong
    2020 5TH INTERNATIONAL CONFERENCE ON MATHEMATICS AND ARTIFICIAL INTELLIGENCE (ICMAI 2020), 2020, : 24 - 28
  • [2] Group-level spatio-temporal pattern recovery in MEG decoding using multi-task joint feature learning
    Kia, Seyed Mostafa
    Pedregosa, Fabian
    Blumenthal, Anna
    Passerini, Andrea
    JOURNAL OF NEUROSCIENCE METHODS, 2017, 285 : 97 - 108
  • [3] Learning Multi-Level Task Groups in Multi-Task Learning
    Han, Lei
    Zhang, Yu
    PROCEEDINGS OF THE TWENTY-NINTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2015, : 2638 - 2644
  • [4] Latent Group Structured Multi-task Learning
    Niu, Xiangyu
    Sun, Yifan
    Sun, Jinyuan
    2018 CONFERENCE RECORD OF 52ND ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS, AND COMPUTERS, 2018, : 850 - 854
  • [5] A cognitive diagnosis model: Similarity group-level decision model
    Cai, Yan
    Tu, Dongbo
    Ding, Shuliang
    INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2012, 47 : 2 - 2
  • [6] Multi-Task Networks With Universe, Group, and Task Feature Learning
    Pentyala, Shiva
    Liu, Mengwen
    Dreyer, Markus
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 820 - 830
  • [7] Multi-task gradient descent for multi-task learning
    Bai, Lu
    Ong, Yew-Soon
    He, Tiantian
    Gupta, Abhishek
    MEMETIC COMPUTING, 2020, 12 (04) : 355 - 369
  • [8] Revisiting Scalarization in Multi-Task Learning: A Theoretical Perspective
    Hu, Yuzheng
    Xian, Ruicheng
    Wu, Qilong
    Fan, Qiuling
    Yin, Lang
    Zhao, Han
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [9] Efficient Group Learning with Hypergraph Partition in Multi-task Learning
    Yao, Quanming
    Jiang, Xiubao
    Gong, Mingming
    You, Xinge
    Liu, Yu
    Xu, Duanquan
    PATTERN RECOGNITION, 2012, 321 : 9 - 16