Multi-task learning approach for utilizing temporal relations in natural language understanding tasks

Cited: 0
|
Authors
Lim, Chae-Gyun [1 ]
Jeong, Young-Seob [2 ]
Choi, Ho-Jin [1 ]
Affiliations
[1] Korea Adv Inst Sci & Technol, Sch Comp, Daejeon 34141, South Korea
[2] Chungbuk Natl Univ, Dept Comp Engn, Cheongju 28644, South Korea
DOI: 10.1038/s41598-023-35009-7
CLC Classification Codes: O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences]
Discipline Classification Codes: 07; 0710; 09
Abstract
Various studies have investigated multi-task learning techniques for natural language understanding (NLU), which build a single model capable of handling multiple tasks with generalized performance. Most documents written in natural language contain time-related information, and recognizing this information accurately and using it to grasp the temporal context is essential for understanding the overall content of a document while performing NLU tasks. In this study, we propose a multi-task learning technique that includes a temporal relation extraction task in the training process of NLU tasks, so that the trained model can exploit temporal context information from the input sentences. To leverage the characteristics of multi-task learning, we designed an additional task that extracts temporal relations from given sentences and configured the multi-task model to train it jointly with existing NLU tasks on Korean and English datasets. We then analyzed how performance differs across combinations of NLU tasks with temporal relation extraction. The single-task accuracy of temporal relation extraction is 57.8% for Korean and 45.1% for English, and it improves to as much as 64.2% and 48.7%, respectively, when combined with other NLU tasks. The experimental results confirm that temporal relation extraction performs better when trained jointly with other NLU tasks in a multi-task setting than when handled individually. Moreover, owing to the linguistic differences between Korean and English, the task combinations that benefit temporal relation extraction differ between the two languages.
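The setup the abstract describes maps naturally onto hard parameter sharing: one encoder is shared across tasks, each task gets its own classification head, and the temporal relation extraction head is trained alongside the other NLU heads. Below is a minimal PyTorch sketch of that pattern; the encoder size, task names (temporal_relation, nli), label counts, and alternating-batch training loop are illustrative assumptions, not the authors' implementation.

```python
# Minimal hard-parameter-sharing sketch (illustrative; not the authors' code).
# Assumptions: a small shared Transformer encoder, one linear head per task,
# and alternating task batches with a cross-entropy loss.
import torch
import torch.nn as nn

class MultiTaskNLUModel(nn.Module):
    def __init__(self, vocab_size=30000, hidden=256, num_labels_per_task=None):
        super().__init__()
        # Shared layers: embeddings plus a small Transformer encoder.
        self.embed = nn.Embedding(vocab_size, hidden)
        layer = nn.TransformerEncoderLayer(d_model=hidden, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Task-specific heads; label counts here are hypothetical.
        num_labels_per_task = num_labels_per_task or {"temporal_relation": 4, "nli": 3}
        self.heads = nn.ModuleDict(
            {task: nn.Linear(hidden, n) for task, n in num_labels_per_task.items()}
        )

    def forward(self, token_ids, task):
        h = self.encoder(self.embed(token_ids))  # (batch, seq, hidden)
        pooled = h.mean(dim=1)                   # simple mean pooling
        return self.heads[task](pooled)          # task-specific logits

# Training loop sketch: alternate batches drawn from each task's dataset.
model = MultiTaskNLUModel()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()
batches = [  # random stand-ins for real (tokens, labels) batches
    ("temporal_relation", torch.randint(0, 30000, (8, 32)), torch.randint(0, 4, (8,))),
    ("nli", torch.randint(0, 30000, (8, 32)), torch.randint(0, 3, (8,))),
]
for task, tokens, labels in batches:
    logits = model(tokens, task)
    loss = loss_fn(logits, labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because every task's gradients update the shared encoder, the temporal relation head can benefit from representations shaped by the other NLU tasks, which is consistent with the accuracy gains the abstract reports.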
Pages: 10
Related Papers (10 of 50 shown)
  • [1] Multi-task learning approach for utilizing temporal relations in natural language understanding tasks
    Lim, Chae-Gyun
    Jeong, Young-Seob
    Choi, Ho-Jin
    Scientific Reports, 2023, 13
  • [2] Bidirectional Transformer Based Multi-Task Learning for Natural Language Understanding
    Tripathi, Suraj
    Singh, Chirag
    Kumar, Abhay
    Pandey, Chandan
    Jain, Nishant
    Natural Language Processing and Information Systems (NLDB 2019), 2019, 11608: 54-65
  • [3] Multi-Task Learning in Natural Language Processing: An Overview
    Chen, Shijie
    Zhang, Yu
    Yang, Qiang
    ACM Computing Surveys, 2024, 56 (12)
  • [4] Multi-Task Deep Neural Networks for Natural Language Understanding
    Liu, Xiaodong
    He, Pengcheng
    Chen, Weizhu
    Gao, Jianfeng
    57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019: 4487-4496
  • [5] MTL-SLT: Multi-Task Learning for Spoken Language Tasks
    Huang, Zhiqi
    Rao, Milind
    Raju, Anirudh
    Zhang, Zhe
    Bui, Bach
    Lee, Chul
    Proceedings of the 4th Workshop on NLP for Conversational AI, 2022: 120-130
  • [6] Multi-Task Learning for Spoken Language Understanding with Shared Slots
    Li, Xiao
    Wang, Ye-Yi
    Tur, Gokhan
    12th Annual Conference of the International Speech Communication Association 2011 (INTERSPEECH 2011), Vols 1-5, 2011: 708+
  • [7] A JOINT MULTI-TASK LEARNING FRAMEWORK FOR SPOKEN LANGUAGE UNDERSTANDING
    Li, Changliang
    Kong, Cunliang
    Zhao, Yan
    2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018: 6054-6058
  • [8] Learning Sparse Task Relations in Multi-Task Learning
    Zhang, Yu
    Yang, Qiang
    Thirty-First AAAI Conference on Artificial Intelligence, 2017: 2914-2920
  • [9] Hierarchical and Bidirectional Joint Multi-Task Classifiers for Natural Language Understanding
    Ji, Xiaoyu
    Hu, Wanyang
    Liang, Yanyan
    Mathematics, 2023, 11 (24)
  • [10] A Sequential and Intensive Weighted Language Modeling Scheme for Multi-Task Learning-Based Natural Language Understanding
    Son, Suhyune
    Hwang, Seonjeong
    Bae, Sohyeun
    Park, Soo Jun
    Choi, Jang-Hwan
    Applied Sciences-Basel, 2021, 11 (07)