Multi-Task Learning of Japanese How-to Tip Machine Reading Comprehension by a Generative Model

Cited by: 0
Authors
Wang, Xiaotian [1 ]
Li, Tingxuan [1 ]
Tamura, Takuya [1 ]
Nishida, Shunsuke [1 ]
Utsuro, Takehito [1 ]
Affiliations
[1] Univ Tsukuba, Grad Sch Sci & Technol, Tsukuba 3058673, Japan
Keywords
multi-task learning; QA task; machine reading comprehension; generative model; how-to tip;
DOI
10.1587/transinf.2023EDP7113
CLC Number (Chinese Library Classification)
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
In research on machine reading comprehension for Japanese how-to tip QA tasks, conventional extractive machine reading comprehension methods have difficulty with cases in which the answer string spans multiple locations in the context; fine-tuning a BERT model for extractive machine reading comprehension is not suitable for such cases. In this paper, we trained a generative machine reading comprehension model for Japanese how-to tips by constructing a generative dataset based on the website "wikiHow" as a source of information. We then proposed two multi-task learning methods for fine-tuning the generative model. The first is multi-task learning with a hybrid generative and extractive training dataset, in which both the generative and the extractive datasets are trained simultaneously on a single model. The second is multi-task learning with inter-sentence semantic similarity and answer generation, in which, alongside the answer generation task, the model additionally learns the distance between the question/context sentences and the answer in the training examples. The evaluation results showed that both multi-task learning methods significantly outperformed single-task learning on generative question-and-answer examples. Of the two, the method with inter-sentence semantic similarity and answer generation performed best in the manual evaluation. The data and code are available at https://github.com/EternalEdenn/multitask_ext-gen_sts-gen.
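The sketch below illustrates, in broad strokes, the two multi-task setups the abstract describes: mixing extractive and generative QA examples in one text-to-text training stream (method 1), and adding an inter-sentence similarity term next to the generation loss (method 2). It is a minimal, hedged reconstruction assuming a seq2seq model fine-tuned via the Hugging Face transformers API; the model name, prompt format, distance function, and loss weight are illustrative assumptions, not details taken from the paper (consult the linked repository for the authors' actual implementation).

```python
# Hypothetical sketch of the two multi-task fine-tuning setups.
# Assumes a text-to-text model (here mT5 as a stand-in); none of the
# prompt formats or hyperparameters below come from the paper itself.
import torch
from torch.nn.functional import cosine_similarity
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

# Method 1: hybrid training dataset -- extractive and generative QA
# examples are cast into the same text-to-text format and mixed, so a
# single model is trained on both tasks simultaneously.
examples = [
    {"source": "extractive question: ... context: ...",
     "target": "answer span copied from the context"},
    {"source": "generative question: ... context: ...",
     "target": "free-form how-to tip answer"},
]

def seq2seq_loss(source, target):
    """Standard cross-entropy loss for conditional generation."""
    inputs = tokenizer(source, return_tensors="pt", truncation=True)
    labels = tokenizer(target, return_tensors="pt", truncation=True).input_ids
    return model(**inputs, labels=labels).loss

# Method 2: joint answer generation and inter-sentence semantic
# similarity. Besides the generation loss, the model is encouraged to
# keep the encoded question/context close to the encoded gold answer.
# Cosine distance over mean-pooled encoder states is one plausible
# reading of "learns the distance between the sentences"; the paper's
# exact distance measure and weighting may differ.
def similarity_loss(source, target):
    def encode(text):
        enc = tokenizer(text, return_tensors="pt", truncation=True)
        # Mean-pool the encoder hidden states into a sentence embedding.
        return model.encoder(**enc).last_hidden_state.mean(dim=1)
    return 1.0 - cosine_similarity(encode(source), encode(target)).mean()

for ex in examples:
    loss = seq2seq_loss(ex["source"], ex["target"])
    loss = loss + 0.1 * similarity_loss(ex["source"], ex["target"])  # weight is illustrative
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

In this reading, method 1 needs no architectural change at all: the task is signaled purely by the input prefix, and the two datasets simply share one parameter set. Method 2 adds an auxiliary term to the same objective, so the two methods can in principle be combined.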
Pages: 125-134
Page count: 10