MTLAN: Multi-Task Learning and Auxiliary Network for Enhanced Sentence Embedding

Cited: 0
|
Authors
Liu, Gang [1 ,2 ]
Wang, Tongli [1 ]
Yang, Wenli [1 ]
Yan, Zhizheng [1 ]
Zhan, Kai [3 ]
Affiliations
[1] Harbin Engn Univ, Coll Comp Sci & Technol, Harbin, Peoples R China
[2] Harbin Engn Univ, Modeling & Emulat E Govt Natl Engn Lab, Harbin, Peoples R China
[3] PwC Enterprise Digital, PricewaterhouseCoopers, Sydney, NSW, Australia
Keywords
Cross-lingual; Sentence embedding; Multi-task learning; Contrastive learning; Auxiliary network;
DOI
10.1007/978-981-99-8067-3_2
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The objective of cross-lingual sentence embedding learning is to map sentences into a shared representation space in which semantically similar sentences lie close together while dissimilar sentences are clearly separated. This paper proposes a novel sentence embedding model called MTLAN, which incorporates multi-task learning and an auxiliary network. The model uses LaBSE to extract sentence features and is jointly trained on sentence semantic representation and distance-measurement tasks. Furthermore, an auxiliary network is employed to enhance the contextual expression of words within sentences. To address the scarcity of resources for low-resource languages, we construct a pseudo-corpus dataset from a multilingual dictionary for unsupervised learning. We conduct experiments on multiple publicly available datasets, including STS and SICK, evaluating both monolingual sentence similarity and cross-lingual semantic similarity. The empirical results demonstrate that our proposed model significantly outperforms state-of-the-art methods.
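As a rough illustration of the contrastive objective that cross-lingual sentence embedding models of this kind typically optimize (a generic InfoNCE-style loss, not necessarily the exact loss used in MTLAN), the sketch below scores each source sentence against every target sentence in a batch and treats the aligned translation as the positive pair. The function name and temperature value are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(src, tgt, temperature=0.05):
    """Contrastive (InfoNCE) loss over paired sentence embeddings:
    row i of `src` should be most similar to row i of `tgt` among
    all targets in the batch."""
    # L2-normalize so dot products become cosine similarities
    src = src / np.linalg.norm(src, axis=1, keepdims=True)
    tgt = tgt / np.linalg.norm(tgt, axis=1, keepdims=True)
    sim = src @ tgt.T / temperature           # (batch, batch) similarity matrix
    sim -= sim.max(axis=1, keepdims=True)     # subtract row max for numerical stability
    log_probs = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # the positive pair for row i is column i
    return -np.mean(np.diag(log_probs))

# Example: correctly aligned pairs should yield a lower loss
# than mismatched (shuffled) pairs.
rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 16))
aligned_loss = info_nce_loss(emb, emb)        # each sentence paired with itself
shuffled_loss = info_nce_loss(emb, emb[::-1]) # positives deliberately misaligned
```

Pulling semantically equivalent sentence pairs together while pushing apart the other in-batch sentences is what shapes the shared representation space the abstract describes.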
Pages: 16-27
Page count: 12
Related Papers
50 records in total
  • [31] Sample-level weighting for multi-task learning with auxiliary tasks
    Gregoire, Emilie
    Chaudhary, Muhammad Hafeez
    Verboven, Sam
    APPLIED INTELLIGENCE, 2024, 54 (04) : 3482 - 3501
  • [32] A novel embedding learning framework for relation completion and recommendation based on graph neural network and multi-task learning
    Zhao, Wenbin
    Li, Yahui
    Fan, Tongrang
    Wu, Feng
    SOFT COMPUTING, 2022, 28 (Suppl 2) : 447 - 447
  • [33] Sample-level weighting for multi-task learning with auxiliary tasks
    Emilie Grégoire
    Muhammad Hafeez Chaudhary
    Sam Verboven
    Applied Intelligence, 2024, 54 : 3482 - 3501
  • [34] Multi-task Neural Network for Robust Multiple Speaker Embedding Extraction
    He, Weipeng
    Motlicek, Petr
    Odobez, Jean-Marc
    INTERSPEECH 2021, 2021, : 506 - 510
  • [35] Enhanced representation and multi-task learning for image annotation
    Binder, Alexander
    Samek, Wojciech
    Mueller, Klaus-Robert
    Kawanabe, Motoaki
    COMPUTER VISION AND IMAGE UNDERSTANDING, 2013, 117 (05) : 466 - 478
  • [36] Usr-mtl: an unsupervised sentence representation learning framework with multi-task learning
    Wenshen Xu
    Shuangyin Li
    Yonghe Lu
    Applied Intelligence, 2021, 51 : 3506 - 3521
  • [37] Usr-mtl: an unsupervised sentence representation learning framework with multi-task learning
    Xu, Wenshen
    Li, Shuangyin
    Lu, Yonghe
    APPLIED INTELLIGENCE, 2021, 51 (06) : 3506 - 3521
  • [38] A Dual-branch Enhanced Multi-task Learning Network for Multimodal Sentiment Analysis
    Geng, Wenxiu
    Li, Xiangxian
    Bian, Yulong
    PROCEEDINGS OF THE 2023 ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA RETRIEVAL, ICMR 2023, 2023, : 481 - 489
  • [39] A Multi-task Text Classification Model Based on Label Embedding Learning
    Xu, Yuemei
    Fan, Zuwei
    Cao, Han
    CYBER SECURITY, CNCERT 2021, 2022, 1506 : 211 - 225
  • [40] Enhanced task attention with adversarial learning for dynamic multi-task CNN
    School of Computer Engineering and Science, Shanghai University, China
    [Authors unknown]
    Pattern Recognition