Ensemble-NQG-T5: Ensemble Neural Question Generation Model Based on Text-to-Text Transfer Transformer

Cited by: 6
Authors
Hwang, Myeong-Ha [1 ]
Shin, Jikang [1 ]
Seo, Hojin [1 ]
Im, Jeong-Seon [1 ]
Cho, Hee [1 ]
Lee, Chun-Kwon [2 ]
Affiliations
[1] Korea Elect Power Res Inst KEPRI, Digital Solut Lab, 105 Munji Ro, Daejeon 34056, South Korea
[2] Pukyong Natl Univ, Dept Control & Instrumentat Engn, Busan 48513, South Korea
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, No. 02
Keywords
neural question generation; deep learning; natural language processing; ensemble algorithm;
DOI
10.3390/app13020903
Chinese Library Classification
O6 [Chemistry];
Subject Classification Code
0703;
Abstract
Research and development of deep learning chatbots has surged recently to offer customers in numerous industries personalized services. However, creating a training dataset for a deep learning chatbot still requires substantial human effort. To augment such datasets, the idea of neural question generation (NQG) has evolved, although it is limited in how diversely questions can be phrased and has a finite capacity for question generation. In this paper, we propose an ensemble-type NQG model based on the text-to-text transfer transformer (T5). Through the proposed model, the number of questions generated by each single NQG model can be greatly increased by considering the mutual similarity and the quality of the questions using a soft-voting method. For training the soft-voting algorithm, the evaluation score and mutual similarity score weights, based on the context and the question-answer (QA) dataset, are used as the threshold weight. Performance comparison results with existing T5-based NQG models on the SQuAD 2.0 dataset demonstrate the effectiveness of the proposed method for QG. The proposed ensemble model is anticipated to be applied across diverse industrial fields, including interactive chatbots, robotic process automation (RPA), and Internet of Things (IoT) services.
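The abstract describes a soft-voting step that filters candidate questions from several single NQG models by combining a quality (evaluation) score with a mutual-similarity score against a threshold. A minimal sketch of that idea is below; the function names, the weight values, and the use of token-level Jaccard overlap as the similarity measure are all illustrative assumptions, not the paper's actual metrics or thresholds.

```python
# Hypothetical sketch of soft-voting question selection, as described in the
# abstract. Weights, threshold, and the Jaccard similarity stand-in are
# assumptions; the paper's own evaluation scores and weights are not given here.

def jaccard_similarity(q1: str, q2: str) -> float:
    """Token-level Jaccard overlap as a stand-in mutual-similarity measure."""
    a, b = set(q1.lower().split()), set(q2.lower().split())
    return len(a & b) / len(a | b) if a | b else 0.0

def soft_vote(candidates, eval_weight=0.6, sim_weight=0.4, threshold=0.5):
    """Select questions whose weighted combined score clears a threshold.

    candidates: list of (question, eval_score) pairs, one or more from each
    single NQG model; eval_score is assumed normalized to [0, 1]. The mutual
    similarity term rewards questions that several models agree on.
    """
    selected = []
    for i, (question, score) in enumerate(candidates):
        others = [c[0] for j, c in enumerate(candidates) if j != i]
        mutual = max((jaccard_similarity(question, o) for o in others),
                     default=0.0)
        combined = eval_weight * score + sim_weight * mutual
        if combined >= threshold:
            selected.append((question, round(combined, 3)))
    return selected

candidates = [
    ("What does NQG stand for?", 0.9),   # two paraphrases reinforce each other
    ("What is NQG short for?", 0.8),
    ("Who wrote the paper?", 0.3),       # low quality, no support: filtered out
]
print(soft_vote(candidates))
```

In this sketch, the two paraphrased questions pass the threshold because their evaluation scores are boosted by mutual similarity, while the isolated low-quality question is rejected.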
Pages: 12
Related Papers
43 items in total
  • [1] Enhance Text-to-Text Transfer Transformer with Generated Questions for Thai Question Answering
    Phakmongkol, Puri
    Vateekul, Peerapon
    APPLIED SCIENCES-BASEL, 2021, 11 (21):
  • [2] Text-to-Text Transfer Transformer Phrasing Model Using Enriched Text Input
    Rezackova, Marketa
    Matousek, Jindrich
    TEXT, SPEECH, AND DIALOGUE (TSD 2022), 2022, 13502 : 389 - 400
  • [3] ViT5: Pretrained Text-to-Text Transformer for Vietnamese Language Generation
    Long Phan
    Hieu Tran
    Hieu Nguyen
    Trinh, Trieu H.
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES: PROCEEDINGS OF THE STUDENT RESEARCH WORKSHOP, 2022, : 136 - 142
  • [4] T5G2P: Text-to-Text Transfer Transformer Based Grapheme-to-Phoneme Conversion
    Rezackova, Marketa
    Tihelka, Daniel
    Matousek, Jindrich
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2024, 32 : 3466 - 3476
  • [5] HaT5: Hate Language Identification using Text-to-Text Transfer Transformer
    Sabry, Sana Sabah
    Adewumi, Tosin
    Abid, Nosheen
    Kovacs, Gyorgy
    Liwicki, Foteini
    Liwicki, Marcus
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [6] Math Word Problem Solver Based on Text-to-Text Transformer Model
    Yang, Chuanzhi
    Huang, Runze
    Yu, Xinguo
    Peng, Rao
    IEEE TALE2021: IEEE INTERNATIONAL CONFERENCE ON ENGINEERING, TECHNOLOGY AND EDUCATION, 2021, : 818 - 822
  • [7] KAT5: Knowledge-Aware Transfer Learning with a Text-to-Text Transfer Transformer
    Sohrab, Mohammad Golam
    Miwa, Makoto
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES-APPLIED DATA SCIENCE TRACK, PT IX, ECML PKDD 2024, 2024, 14949 : 157 - 173
  • [8] T5G2P: Using Text-to-Text Transfer Transformer for Grapheme-to-Phoneme Conversion
    Rezackova, Marketa
    Svec, Jan
    Tihelka, Daniel
    INTERSPEECH 2021, 2021, : 6 - 10
  • [9] End-to-End generation of Multiple-Choice questions using Text-to-Text transfer Transformer models
    Rodriguez-Torrealba, Ricardo
    Garcia-Lopez, Eva
    Garcia-Cabot, Antonio
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 208
  • [10] Text-To-Text Transfer Transformer Based Method for Generating Startup Scenarios for New Equipment in Power Grids
    Tao, Wenbiao
    Wang, Liang
    Meng, Qingmeng
    Li, Rui
    Han, Peng
    Shi, Yuxin
    Shan, Lianfei
    Geng, Xiaofei
    APPLIED ARTIFICIAL INTELLIGENCE, 2024, 38 (01)