Multi-Task Learning of Japanese How-to Tip Machine Reading Comprehension by a Generative Model

Cited by: 0
Authors
Wang, Xiaotian [1]
Li, Tingxuan [1]
Tamura, Takuya [1]
Nishida, Shunsuke [1]
Utsuro, Takehito [1]
Affiliations
[1] Univ Tsukuba, Grad Sch Sci & Technol, Tsukuba 3058673, Japan
Keywords
multi-task learning; QA task; machine reading comprehension; generative model; how-to tip
DOI
10.1587/transinf.2023EDP7113
CLC Number
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
In research on machine reading comprehension for Japanese how-to tip QA tasks, conventional extractive methods have difficulty with cases in which the answer string spans multiple locations in the context, so fine-tuning a BERT model for extractive machine reading comprehension is not suitable for such cases. In this paper, we trained a generative machine reading comprehension model for Japanese how-to tips by constructing a generative dataset based on the website "wikiHow" as a source of information. We then proposed two multi-task learning methods for fine-tuning the generative model. The first is multi-task learning with a hybrid generative and extractive training dataset, in which both datasets are trained simultaneously on a single model. The second is multi-task learning with inter-sentence semantic similarity and answer generation, in which, alongside the answer generation task, the model additionally learns the distance between the sentences of the question/context and the answer in the training examples. The evaluation results showed that both multi-task learning methods significantly outperformed single-task learning on generative question-and-answer examples. Of the two multi-task learning methods, the one with inter-sentence semantic similarity and answer generation performed best in the manual evaluation. The data and the code are available at https://github.com/EternalEdenn/multitask_ext-gen_sts-gen.
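To make the two strategies concrete, the following is a minimal sketch only, not the authors' implementation (which is available at the GitHub repository above). It assumes an mT5-style encoder-decoder from Hugging Face transformers; the checkpoint name, the input serialization, the mean-pooled encoder representations, the cosine distance, and the weight `alpha` are all illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of the two multi-task fine-tuning strategies.
# All names and hyperparameters here are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("google/mt5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/mt5-small")

def encode(question, context, answer):
    """Serialize one QA example for seq2seq training."""
    src = tokenizer(f"question: {question} context: {context}",
                    truncation=True, max_length=512, return_tensors="pt")
    tgt = tokenizer(answer, truncation=True, max_length=128,
                    return_tensors="pt").input_ids
    tgt[tgt == tokenizer.pad_token_id] = -100  # ignore padding in the CE loss
    return src, tgt

# Method 1 (hybrid dataset): extractive examples are cast as generation
# targets and shuffled together with generative ones, so a single model
# is trained on the union of both datasets.
def hybrid_batches(generative_data, extractive_data):
    for question, context, answer in generative_data + extractive_data:
        yield encode(question, context, answer)

# Method 2 (auxiliary similarity task): alongside the answer-generation
# cross-entropy, penalize the distance between the mean-pooled encoder
# representation of the question/context and that of the gold answer.
alpha = 0.5  # illustrative weight for the auxiliary term

def joint_loss(src, tgt, answer_text):
    out = model(**src, labels=tgt)  # generation cross-entropy
    qc_repr = model.encoder(**src).last_hidden_state.mean(dim=1)
    ans = tokenizer(answer_text, return_tensors="pt")
    ans_repr = model.encoder(**ans).last_hidden_state.mean(dim=1)
    sim = torch.nn.functional.cosine_similarity(qc_repr, ans_repr)
    return out.loss + alpha * (1.0 - sim.mean())  # joint objective
```

Under these assumptions, method 1 needs no architectural change at all (the extractive data is simply reformatted for generation), while method 2 adds one extra encoder pass per example plus a scalar auxiliary loss, so both remain cheap to train on a single model.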
Pages: 125-134 (10 pages)