Ensemble-NQG-T5: Ensemble Neural Question Generation Model Based on Text-to-Text Transfer Transformer

Cited by: 6
Authors
Hwang, Myeong-Ha [1 ]
Shin, Jikang [1 ]
Seo, Hojin [1 ]
Im, Jeong-Seon [1 ]
Cho, Hee [1 ]
Lee, Chun-Kwon [2 ]
Affiliations
[1] Korea Elect Power Res Inst KEPRI, Digital Solut Lab, 105 Munji Ro, Daejeon 34056, South Korea
[2] Pukyong Natl Univ, Dept Control & Instrumentat Engn, Busan 48513, South Korea
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Issue 2
Keywords
neural question generation; deep learning; natural language processing; ensemble algorithm;
DOI
10.3390/app13020903
Chinese Library Classification (CLC)
O6 [Chemistry];
Discipline Classification Code
0703;
Abstract
Research and development on deep learning chatbots has grown rapidly in recent years to provide personalized services to customers in numerous industries. However, building the training datasets for such chatbots still relies heavily on human labor. Neural question generation (NQG) has emerged to automate this process, but a single NQG model is limited in how diversely it can phrase questions and in how many questions it can generate. In this paper, we propose an ensemble NQG model based on the text-to-text transfer transformer (T5). The proposed model greatly increases the number of questions generated relative to each single NQG model by considering both the quality of the generated questions and their mutual similarity through a soft-voting method. To train the soft-voting algorithm, evaluation-score and mutual-similarity-score weights computed from the context and the question-answer (QA) dataset are used as threshold weights. Performance comparisons with existing T5-based NQG models on the SQuAD 2.0 dataset demonstrate the effectiveness of the proposed method for question generation. The proposed ensemble model is expected to be applied across diverse industrial fields in the future, including interactive chatbots, robotic process automation (RPA), and Internet of Things (IoT) services.
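The sketch below illustrates the kind of soft-voting selection the abstract describes: candidate questions from several single NQG models are pooled, candidates below an evaluation-score threshold are discarded, and candidates too similar to already accepted ones are dropped. The model names, the token-overlap (Jaccard) similarity measure, the placeholder quality scorer, and the threshold values are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of a soft-voting question selector, assuming a quality score
# per question and a pairwise similarity measure (both simplified here).
from typing import Callable, Dict, List


def jaccard_similarity(a: str, b: str) -> float:
    """Token-overlap similarity between two questions (assumed measure)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    if not ta or not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)


def soft_vote(
    candidates: Dict[str, List[str]],       # model name -> generated questions
    quality_score: Callable[[str], float],  # evaluation score per question
    score_threshold: float = 0.5,           # assumed evaluation-score weight
    similarity_threshold: float = 0.7,      # assumed mutual-similarity weight
) -> List[str]:
    """Select a diverse, high-quality question set from all model outputs."""
    # Pool every candidate question, best-scoring first.
    pooled = sorted(
        (q for qs in candidates.values() for q in qs),
        key=quality_score,
        reverse=True,
    )
    selected: List[str] = []
    for question in pooled:
        if quality_score(question) < score_threshold:
            continue  # below the evaluation-score threshold weight
        if any(jaccard_similarity(question, kept) >= similarity_threshold
               for kept in selected):
            continue  # too similar to an already accepted question
        selected.append(question)
    return selected


if __name__ == "__main__":
    # Toy example with hypothetical outputs from three single NQG models.
    outputs = {
        "t5-small-nqg": ["What is the capital of France?",
                         "What is France's capital city?"],
        "t5-base-nqg": ["Which city is the capital of France?"],
        "t5-large-nqg": ["Where is Paris located?"],
    }

    def score(question: str) -> float:
        # Placeholder quality score: longer questions score higher (assumption).
        return min(1.0, len(question.split()) / 10)

    print(soft_vote(outputs, score))
```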
Pages: 12
Related Papers
43 records in total
  • [31] TST-GAN: A Legal Document Generation Model Based on Text Style Transfer
    Li, Xiaolin
    Huang, Lei
    Zhou, Yifan
    Shao, Changcheng
    2021 4TH INTERNATIONAL CONFERENCE ON ROBOTICS, CONTROL AND AUTOMATION ENGINEERING (RCAE 2021), 2021, : 90 - 93
  • [32] A Transformer-Based Hierarchical Variational AutoEncoder Combined Hidden Markov Model for Long Text Generation
    Zhao, Kun
    Ding, Hongwei
    Ye, Kai
    Cui, Xiaohui
    ENTROPY, 2021, 23 (10)
  • [33] Chinese Text Error Correction Based on PE-T5 Model
    Deng, Hua
    Xu, Kang
    Li, Rongsheng
    Qi, Yifei
    2024 5TH INTERNATIONAL CONFERENCE ON COMPUTING, NETWORKS AND INTERNET OF THINGS, CNIOT 2024, 2024, : 223 - 227
  • [34] A Text-Driven Aircraft Fault Diagnosis Model Based on Word2vec and Stacking Ensemble Learning
    Zhou, Shenghan
    Wei, Chaofan
    Li, Pan
    Liu, Anying
    Chang, Wenbing
    Xiao, Yiyong
    AEROSPACE, 2021, 8 (12)
  • [35] Bidirectional Transformer based on online Text-based information to Implement Convolutional Neural Network Model For Secure Business Investment
    Heidari, Maryam
    Rafatirad, Setareh
    PROCEEDINGS OF THE 2020 IEEE INTERNATIONAL SYMPOSIUM ON TECHNOLOGY AND SOCIETY (ISTAS), 2021, : 322 - 329
  • [36] A transformer-Based neural language model that synthesizes brain activation maps from free-form text queries
    Ngo, Gia H.
    Nguyen, Minh
    Chen, Nancy F.
    Sabuncu, Mert R.
    MEDICAL IMAGE ANALYSIS, 2022, 81
  • [37] Model Agnostic Meta-Learning (MAML)-Based Ensemble Model for Accurate Detection of Wheat Diseases Using Vision Transformer and Graph Neural Networks
    Maqsood, Yasir
    Usman, Syed Muhammad
    Alhussein, Musaed
    Aurangzeb, Khursheed
    Khalid, Shehzad
    Zubair, Muhammad
    CMC-COMPUTERS MATERIALS & CONTINUA, 2024, 79 (02): : 2795 - 2811
  • [38] T2TD: Text-3D Generation Model Based on Prior Knowledge Guidance
    Nie, Weizhi
    Chen, Ruidong
    Wang, Weijie
    Lepri, Bruno
    Sebe, Nicu
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2025, 47 (01) : 172 - 189
  • [39] IGSQL: Database Schema Interaction Graph Based Neural Model for Context-Dependent Text-to-SQL Generation
    Cai, Yitao
    Wan, Xiaojun
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 6903 - 6912
  • [40] Automatic Summary Generation of Chinese News Text Based on Cross-Layer Parameter Shared Transformer-Encoder-PGN Model
    Wu, Yuan
    Liu, Changhui
    Liu, Yangwenhao
    2021 4TH INTERNATIONAL CONFERENCE ON ROBOTICS, CONTROL AND AUTOMATION ENGINEERING (RCAE 2021), 2021, : 94 - 98