Co-Training Semi-Supervised Deep Learning for Sentiment Classification of MOOC Forum Posts

Cited: 17
Authors
Chen, Jing [1 ]
Feng, Jun [1 ,2 ]
Sun, Xia [1 ]
Liu, Yang [1 ]
Affiliations
[1] Northwest Univ, Sch Informat Sci & Technol, Xian 710127, Shaanxi, Peoples R China
[2] Northwest Univ, State Prov Joint Engn & Res Ctr Adv Networking &, Sch Informat Sci & Technol, Xian 710127, Shaanxi, Peoples R China
Source
SYMMETRY-BASEL | 2020, Vol. 12, Issue 01
Funding
National Natural Science Foundation of China
Keywords
co-training; semi-supervised learning; sentiment classification; asymmetric data; MOOC;
DOI
10.3390/sym12010008
Chinese Library Classification (CLC) numbers
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline classification codes
07 ; 0710 ; 09 ;
Abstract
Sentiment classification of forum posts in massive open online courses (MOOCs) is essential for educators to make timely interventions and for instructors to improve learning performance. Failure to monitor learners' sentiments may lead to high course dropout rates. Recently, deep learning has emerged as an outstanding machine learning technique for sentiment classification, as it automatically extracts complex features with rich representational capability. However, deep neural networks rely on large amounts of labeled data for supervised training, and constructing large-scale labeled training datasets for sentiment classification is laborious and time-consuming. To address this problem, this paper proposes a co-training semi-supervised deep learning model for sentiment classification that leverages limited labeled data and massive unlabeled data simultaneously, achieving performance comparable to methods trained on massive labeled data. To satisfy co-training's requirement of two views, we encoded texts into vectors independently from a word-embedding view and a character-based embedding view, capturing both the external and internal information of words. To improve classification performance with limited data, we propose a double-check sample selection strategy that iteratively augments the training set with high-confidence samples. In addition, we propose a mixed loss function that accounts for both the asymmetric (class-imbalanced) labeled data and the unlabeled data. The proposed method achieved an average accuracy of 89.73% and an average F1-score of 93.55%, about 2.77% and 3.2% higher than baseline methods, respectively. Experimental results demonstrate the effectiveness of the proposed model trained on limited labeled data, which performs much better than baselines trained on massive labeled data.
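The co-training loop with the double-check selection rule described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the nearest-centroid classifier, the synthetic feature views, and the threshold value are stand-ins (the paper uses deep models over word-embedding and character-embedding views). The double-check rule is the core idea shown: an unlabeled sample is pseudo-labeled and added to the training set only when both view-specific classifiers agree on its label and each predicts it with confidence above a threshold.

```python
import numpy as np

class CentroidClassifier:
    """Toy per-view classifier: softmax over negative distances to class centroids."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict_proba(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        w = np.exp(-d)  # closer centroid -> higher weight
        return w / w.sum(axis=1, keepdims=True)

    def predict(self, X):
        return self.classes_[self.predict_proba(X).argmax(axis=1)]

def co_train(Xw, Xc, y, Xw_u, Xc_u, rounds=5, thresh=0.6):
    """Xw/Xc: labeled word-view and char-view features; Xw_u/Xc_u: unlabeled views."""
    Xw, Xc, y = Xw.copy(), Xc.copy(), y.copy()
    for _ in range(rounds):
        if len(Xw_u) == 0:
            break
        f1 = CentroidClassifier().fit(Xw, y)   # word-view classifier
        f2 = CentroidClassifier().fit(Xc, y)   # char-view classifier
        p1, p2 = f1.predict_proba(Xw_u), f2.predict_proba(Xc_u)
        l1, l2 = p1.argmax(axis=1), p2.argmax(axis=1)
        # Double-check: label agreement AND both confidences above threshold.
        keep = (l1 == l2) & (p1.max(axis=1) > thresh) & (p2.max(axis=1) > thresh)
        if not keep.any():
            break
        # Augment the labeled set with the accepted pseudo-labeled samples.
        Xw = np.vstack([Xw, Xw_u[keep]])
        Xc = np.vstack([Xc, Xc_u[keep]])
        y = np.concatenate([y, f1.classes_[l1[keep]]])
        Xw_u, Xc_u = Xw_u[~keep], Xc_u[~keep]
    return CentroidClassifier().fit(Xw, y), CentroidClassifier().fit(Xc, y)
```

Requiring agreement plus per-view confidence is what distinguishes the double-check strategy from classic co-training, where each classifier independently hands its most confident predictions to the other; the stricter rule reduces the risk of reinforcing pseudo-labeling errors when labeled data is scarce.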
Pages: 24