On the Effectiveness of Self-Training in MOOC Dropout Prediction

Cited: 18
Authors
Goel, Yamini [1 ]
Goyal, Rinkaj [1 ]
Institutions
[1] Guru Gobind Singh Indraprastha Univ, Univ Sch Informat Commun & Technol, New Delhi 110078, India
Keywords
Semi-Supervised Learning; Deep Learning; Self-Training; MOOCs; Dropout Prediction; ONLINE; QUALITY
DOI
10.1515/comp-2020-0153
CLC Classification Number
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
Massive open online courses (MOOCs) have gained enormous popularity in recent years and have attracted learners worldwide. However, MOOCs face a crucial challenge: a high dropout rate, which varies between 91% and 93%. The interplay between learning analytics strategies and MOOCs has emerged as a research area aimed at reducing this dropout rate. Most existing studies use click-stream features as engagement patterns to predict at-risk students. This study, by contrast, combines click-stream features with the influence of a learner's friends, based on their demographics, to identify potential dropouts. Existing predictive models rely on supervised learning techniques that require large quantities of hand-labelled data for training. In practice, however, such massive labelled data sets are scarce, making training difficult. This study therefore uses self-training, a semi-supervised learning approach, to develop predictive models. Experimental results on a public data set demonstrate that semi-supervised models attain results comparable to state-of-the-art approaches while retaining the flexibility of using only a small quantity of labelled data. Seven well-known optimizers were deployed to train the self-training classifiers; among them, Stochastic Gradient Descent (SGD) outperformed the others with an F1 score of 94.29%, affirming the relevance of this exposition.
Pages: 246-258
Page count: 13
Related Papers
50 in total
  • [21] Doubly Robust Self-Training
    Zhu, Banghua
    Ding, Mingyu
    Jacobson, Philip
    Wu, Ming
    Zhan, Wei
    Jordan, Michael I.
    Jiao, Jiantao
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [22] Deep Bayesian Self-Training
    Ribeiro, Fabio De Sousa
    Caliva, Francesco
    Swainson, Mark
    Gudmundsson, Kjartan
    Leontidis, Georgios
    Kollias, Stefanos
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (09): 4275-4291
  • [23] RECURSIVE SELF-TRAINING ALGORITHMS
    TSYPKIN, YZ
    KELMANS, GK
    ENGINEERING CYBERNETICS, 1967, (05): 70-&
  • [24] EDPS: Early Dropout Prediction System of MOOC Courses
    Zhang, Jiaxuan
    Ma, Kun
    2022 29TH ASIA-PACIFIC SOFTWARE ENGINEERING CONFERENCE, APSEC, 2022: 562-563
  • [25] Rethinking Pre-training and Self-training
    Zoph, Barret
    Ghiasi, Golnaz
    Lin, Tsung-Yi
    Cui, Yin
    Liu, Hanxiao
    Cubuk, Ekin D.
    Le, Quoc V.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [26] Self-Training Statistical Quality Prediction of Batch Processes with Limited Quality Data
    Ge, Zhiqiang
    Song, Zhihuan
    Gao, Furong
    INDUSTRIAL & ENGINEERING CHEMISTRY RESEARCH, 2013, 52 (02): 979-984
  • [27] An Evaluation of Self-training Styles for Domain Adaptation on the Task of Splice Site Prediction
    Herndon, Nic
    Caragea, Doina
    PROCEEDINGS OF THE 2015 IEEE/ACM INTERNATIONAL CONFERENCE ON ADVANCES IN SOCIAL NETWORKS ANALYSIS AND MINING (ASONAM 2015), 2015: 1042-1047
  • [28] Low-Resource Mandarin Prosodic Structure Prediction Using Self-Training
    Wang, Xingrui
    Zhang, Bowen
    Shinozaki, Takahiro
    2021 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA ASC), 2021: 859-863
  • [29] SELF-BLM: Prediction of drug-target interactions via self-training SVM
    Keum, Jongsoo
    Nam, Hojung
    PLOS ONE, 2017, 12 (02)
  • [30] Self-Training System of Calligraphy Brushwork
    Morikawa, Ami
    Tsuda, Naoaki
    Nomura, Yoshihiko
    Kato, Norihiko
    COMPANION OF THE 2017 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI'17), 2017: 215-216