Multi-Task Learning of Japanese How-to Tip Machine Reading Comprehension by a Generative Model

Cited by: 0
Authors
Wang, Xiaotian [1]
Li, Tingxuan [1]
Tamura, Takuya [1]
Nishida, Shunsuke [1]
Utsuro, Takehito [1]
Affiliations
[1] Univ Tsukuba, Grad Sch Sci & Technol, Tsukuba 3058673, Japan
Keywords
multi-task learning; QA task; machine reading comprehension; generative model; how-to tip
DOI
10.1587/transinf.2023EDP7113
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Number
0812
Abstract
In research on machine reading comprehension for Japanese how-to tip QA tasks, conventional extractive machine reading comprehension methods have difficulty dealing with cases in which the answer string spans multiple locations in the context. Fine-tuning the BERT model for machine reading comprehension tasks is not suitable for such cases. In this paper, we trained a generative machine reading comprehension model for Japanese how-to tips by constructing a generative dataset based on the website wikiHow as a source of information. We then proposed two multi-task learning methods for fine-tuning the generative model. The first is multi-task learning with a hybrid generative and extractive training dataset, in which both the generative and the extractive datasets are trained simultaneously on a single model. The second is multi-task learning with inter-sentence semantic similarity and answer generation, in which, alongside the answer generation task, the model additionally learns the distance between the sentences of the question/context and the answer in the training examples. The evaluation results showed that both multi-task learning methods significantly outperformed single-task learning on generative question-and-answer examples. Of the two multi-task learning methods, the one with inter-sentence semantic similarity and answer generation performed best in the manual evaluation. The data and the code are available at https://github.com/EternalEdenn/multitask_ext-gen_sts-gen.
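The following is a minimal sketch, in PyTorch with Hugging Face Transformers, of how the two objectives described in the abstract could be combined. The checkpoint name, the text-to-text field format, the mean-pooled cosine-similarity term, and the `aux_weight` hyperparameter are all illustrative assumptions for this sketch, not the authors' released implementation; see the linked repository for the actual code.

```python
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed T5-style Japanese seq2seq checkpoint; any seq2seq model would do.
MODEL_NAME = "sonoisa/t5-base-japanese"
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def encode_qa(question, context, answer):
    # Method 1 idea: cast both extractive and generative examples into one
    # shared text-to-text format so a single model trains on the mixed set.
    src = f"question: {question} context: {context}"
    enc = tokenizer(src, truncation=True, max_length=512,
                    return_tensors="pt")
    labels = tokenizer(answer, truncation=True, max_length=128,
                       return_tensors="pt").input_ids
    return enc, labels

def mean_pool(hidden, mask):
    # Average token embeddings over non-padding positions.
    m = mask.unsqueeze(-1).float()
    return (hidden * m).sum(1) / m.sum(1).clamp(min=1e-9)

def multitask_loss(question, context, answer, aux_weight=0.5):
    # Method 2 idea: answer-generation loss plus an auxiliary term pulling
    # the question/context embedding toward the gold-answer embedding.
    enc, labels = encode_qa(question, context, answer)
    gen_loss = model(input_ids=enc.input_ids,
                     attention_mask=enc.attention_mask,
                     labels=labels).loss

    ans = tokenizer(answer, truncation=True, max_length=128,
                    return_tensors="pt")
    encoder = model.get_encoder()
    src_vec = mean_pool(
        encoder(input_ids=enc.input_ids,
                attention_mask=enc.attention_mask).last_hidden_state,
        enc.attention_mask)
    ans_vec = mean_pool(
        encoder(input_ids=ans.input_ids,
                attention_mask=ans.attention_mask).last_hidden_state,
        ans.attention_mask)
    sim_loss = 1.0 - F.cosine_similarity(src_vec, ans_vec).mean()
    return gen_loss + aux_weight * sim_loss
```

In a training loop, calling `multitask_loss(...).backward()` per example (or per padded batch) and stepping an optimizer would realize the joint objective; `aux_weight` trades off the generation task against the similarity task.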
Pages: 125-134 (10 pages)
Related Papers (50 in total)
  • [1] Japanese How-to Tip Machine Reading Comprehension by Multi-task Learning Based on Generative Model
    Wang, Xiaotian
    Li, Tingxuan
    Tamura, Takuya
    Nishida, Shunsuke
    Zhu, Fuzhu
    Utsuro, Takehito
    TEXT, SPEECH, AND DIALOGUE, TSD 2023, 2023, 14102 LNAI : 3 - 14
  • [2] Multi-Task Learning with Generative Adversarial Training for Multi-Passage Machine Reading Comprehension
    Ren, Qiyu
    Cheng, Xiang
    Su, Sen
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI 2020), 2020, 34 : 8705 - 8712
  • [3] A Multi-Task Learning Machine Reading Comprehension Model for Noisy Document
    Wu, Zhijing
    Xu, Hua
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE (AAAI 2020), 2020, 34 : 13963 - 13964
  • [4] Multi-task transfer learning for biomedical machine reading comprehension
    Guo, Wenyang
    Du, Yongping
    Zhao, Yiliang
    Ren, Keyan
    INTERNATIONAL JOURNAL OF DATA MINING AND BIOINFORMATICS, 2020, 23 (03) : 234 - 250
  • [5] Multi-task joint training model for machine reading comprehension
    Li, Fangfang
    Shan, Youran
    Mao, Xingliang
    Ren, Xingkai
    Liu, Xiyao
    Zhang, Shichao
    NEUROCOMPUTING, 2022, 488 : 66 - 77
  • [6] Multi-task Learning with Sample Re-weighting for Machine Reading Comprehension
    Xu, Yichong
    Liu, Xiaodong
    Shen, Yelong
    Liu, Jingjing
    Gao, Jianfeng
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019 : 2644 - 2655
  • [7] Improving Machine Reading Comprehension with Multi-Task Learning and Self-Training
    Ouyang, Jianquan
    Fu, Mengen
    MATHEMATICS, 2022, 10 (03)
  • [8] Multi-Passage Machine Reading Comprehension Through Multi-Task Learning and Dual Verification
    Li, Xingyi
    Cheng, Xiang
    Xia, Min
    Ren, Qiyu
    He, Zhaofeng
    Su, Sen
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2024, 36 (10) : 5280 - 5293
  • [9] Named Entity Recognition via Machine Reading Comprehension: A Multi-Task Learning Approach
    Wang, Yibo
    Zhao, Wenting
    Wan, Yao
    Deng, Zhongfen
    Yu, Philip S.
    13TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING AND THE 3RD CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, IJCNLP-AACL 2023, 2023 : 13 - 19