Multi-task learning approach for utilizing temporal relations in natural language understanding tasks

Citations: 0
Authors
Lim, Chae-Gyun [1]
Jeong, Young-Seob [2]
Choi, Ho-Jin [1]
Affiliations
[1] Korea Adv Inst Sci & Technol, Sch Comp, Daejeon 34141, South Korea
[2] Chungbuk Natl Univ, Dept Comp Engn, Cheongju 28644, South Korea
DOI
10.1038/s41598-023-35009-7
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Subject Classification
07; 0710; 09
Abstract
Various studies have investigated multi-task learning techniques for natural language understanding (NLU), which build a single model capable of processing multiple tasks with generalized performance. Most documents written in natural language contain time-related information, and it is essential to recognize such information accurately and use it to understand the context and overall content of a document when performing NLU tasks. In this study, we propose a multi-task learning technique that includes a temporal relation extraction task in the training process of NLU tasks, so that the trained model can exploit temporal context information from the input sentences. To leverage the characteristics of multi-task learning, an additional task that extracts temporal relations from given sentences was designed, and the multi-task model was configured to learn it jointly with existing NLU tasks on Korean and English datasets. Performance differences were then analyzed across combinations of NLU tasks with temporal relation extraction. The accuracy of the single-task model for temporal relation extraction is 57.8% for Korean and 45.1% for English, improving to as much as 64.2% and 48.7%, respectively, when combined with other NLU tasks. The experimental results confirm that temporal relation extraction performance improves when the task is trained jointly with other NLU tasks in multi-task learning, compared with training it individually. Moreover, owing to the linguistic differences between Korean and English, the task combinations that benefit temporal relation extraction differ between the two languages.
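The record itself carries no implementation details, but the setup the abstract describes (one shared encoder with one output head per task, trained jointly on temporal relation extraction and other NLU tasks) can be sketched roughly as below. This is a minimal hypothetical sketch assuming a Hugging Face BERT-style encoder; the class name MultiTaskModel, the task names ("temprel", "nli"), and the label counts are illustrative assumptions, not the authors' actual code.

```python
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class MultiTaskModel(nn.Module):
    """Hard parameter sharing: one shared encoder, one classification
    head per task. Task names and label counts are illustrative only."""

    def __init__(self, encoder_name: str, task_num_labels: dict):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)  # shared body
        hidden = self.encoder.config.hidden_size
        self.heads = nn.ModuleDict(
            {task: nn.Linear(hidden, n) for task, n in task_num_labels.items()}
        )

    def forward(self, task: str, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]  # [CLS] token representation
        return self.heads[task](pooled)  # logits for the requested task


if __name__ == "__main__":
    name = "bert-base-multilingual-cased"  # covers both Korean and English
    tokenizer = AutoTokenizer.from_pretrained(name)
    # "temprel" = temporal relation extraction; "nli" stands in for any
    # other NLU task trained jointly with it.
    model = MultiTaskModel(name, {"temprel": 4, "nli": 3})
    batch = tokenizer(["He left before the meeting started."],
                      return_tensors="pt", padding=True)
    logits = model("temprel", batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # torch.Size([1, 4])
```

In joint training, one would typically alternate mini-batches across tasks (or sample tasks in proportion to dataset size) so that the shared encoder receives temporal-relation supervision alongside the other objectives; the paper's exact training schedule and task set are not specified in this record.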
Pages: 10
Related Papers (50 in total; records [41]-[50] shown)
  • [41] Empirical evaluation of multi-task learning in deep neural networks for natural language processing. Li, Jianquan; Liu, Xiaokang; Yin, Wenpeng; Yang, Min; Ma, Liqun; Jin, Yaohong. Neural Computing & Applications, 2021, 33(9): 4417-4428.
  • [42] Multi-Task Learning Model for Kazakh Query Understanding. Haisa, Gulizada; Altenbek, Gulila. Sensors, 2022, 22(24).
  • [43] A Multi-task Approach to Learning Multilingual Representations. Singla, Karan; Can, Dogan; Narayanan, Shrikanth. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics, Vol. 2, 2018: 214-220.
  • [44] A Multi-task Learning Approach for Image Captioning. Zhao, Wei; Wang, Benyou; Ye, Jianbo; Yang, Min; Zhao, Zhou; Luo, Ruotian; Qiao, Yu. Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, 2018: 1205-1211.
  • [45] Multi-Task Learning with Language Modeling for Question Generation. Zhou, Wenjie; Zhang, Minghua; Wu, Yunfang. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019: 3394-3399.
  • [46] Variations of multi-task learning for spoken language assessment. Wong, Jeremy H. M.; Zhang, Huayun; Chen, Nancy F. Interspeech, 2022: 4456-4460.
  • [47]/[48] Multi-task gradient descent for multi-task learning. Bai, Lu; Ong, Yew-Soon; He, Tiantian; Gupta, Abhishek. Memetic Computing, 2020, 12(4): 355-369.
  • [49] Metric-based Regularization and Temporal Ensemble for Multi-task Learning using Heterogeneous Unsupervised Tasks. Kim, Dae Ha; Lee, Seung Hyun; Song, Byung Cheol. 2019 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 2019: 2903-2912.
  • [50] A Simple Approach to Balance Task Loss in Multi-Task Learning. Liang, Sicong; Deng, Chang; Zhang, Yu. 2021 IEEE International Conference on Big Data (Big Data), 2021: 812-823.