Sentence-graph-level knowledge injection with multi-task learning

Cited: 0
Authors
Chen, Liyi [1 ]
Wang, Runze [2 ]
Shi, Chen [2 ]
Yuan, Yifei [3 ]
Liu, Jie [1 ]
Hu, Yuxiang [2 ]
Jiang, Feijun [2 ]
Affiliations
[1] Nankai Univ, Coll Artificial Intelligence, Tianjin, Peoples R China
[2] Alibaba Grp, Hangzhou, Peoples R China
[3] Chinese Univ Hong Kong, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Language representation learning; Knowledge graph; Knowledge injection; Multi-task learning;
DOI
10.1007/s11280-025-01329-z
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Language representation learning is a fundamental task in natural language understanding. It aims to represent natural language sentences and classify the entities and relations they mention, which usually requires injecting external entity and relation knowledge into the sentence representation. Existing methods typically inject factual knowledge into pre-trained language models by concatenating it sequentially after the sentence, paying less attention to the structured information in the knowledge graph and the interactions within it. In this paper, we learn sentence representations from both sentence- and graph-level knowledge at the fine-tuning stage with a multi-task learning framework (SenGraph). At the sentence level, we concatenate factual knowledge with the sentence in a sequential structure and train it with a sentence-level task. At the graph level, we organize all knowledge and sentence information into a graph and introduce a relational graph attention network (GAT) to selectively inject useful knowledge into the sentence representation. Meanwhile, we design two graph-based auxiliary tasks to align the heterogeneous embedding spaces of natural language sentences and the knowledge graph. We evaluate our model on four knowledge-driven benchmark datasets. The experimental results demonstrate the effectiveness of the proposed method while using fewer computational resources.
Pages: 20
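To make the architecture described in the abstract concrete, the following is a minimal PyTorch sketch of two of its ingredients: a relation-aware graph attention layer that selectively injects knowledge-graph nodes into the sentence node, and a weighted multi-task loss combining the sentence-level task with graph-based auxiliary tasks. This is an illustration under stated assumptions, not the authors' implementation; the class name, the edge-typing scheme, and the loss weights are hypothetical.

```python
# Illustrative sketch only: a relation-aware graph attention layer and a combined
# multi-task loss, loosely following the SenGraph description in the abstract.
# All names (RelationalGATLayer, num_relations, loss weights) are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class RelationalGATLayer(nn.Module):
    """One graph-attention layer whose scores also condition on edge relation types."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.w_node = nn.Linear(dim, dim, bias=False)    # shared node projection
        self.rel_emb = nn.Embedding(num_relations, dim)  # one vector per relation type
        self.attn = nn.Linear(3 * dim, 1, bias=False)    # scores [h_dst ; h_src ; r]

    def forward(self, h, edge_index, edge_type):
        # h: (N, dim) node states; edge_index: (2, E) src/dst ids; edge_type: (E,)
        src, dst = edge_index
        h_proj = self.w_node(h)
        r = self.rel_emb(edge_type)
        scores = self.attn(torch.cat([h_proj[dst], h_proj[src], r], dim=-1)).squeeze(-1)
        e = torch.exp(F.leaky_relu(scores, negative_slope=0.2))
        # softmax over the incoming edges of each destination node
        denom = torch.zeros(h.size(0), device=h.device).index_add_(0, dst, e) + 1e-9
        alpha = (e / denom[dst]).unsqueeze(-1)
        out = torch.zeros_like(h_proj).index_add_(0, dst, alpha * h_proj[src])
        return F.elu(out + h_proj)  # residual keeps the original sentence signal


def multi_task_loss(sentence_loss, aux_losses, weights=(0.5, 0.5)):
    """Weighted sum of the sentence-level loss and the graph-based auxiliary losses."""
    return sentence_loss + sum(w * l for w, l in zip(weights, aux_losses))


# Toy usage: 3 nodes (one sentence node, two entity nodes), 2 typed edges.
if __name__ == "__main__":
    layer = RelationalGATLayer(dim=8, num_relations=4)
    h = torch.randn(3, 8)
    edge_index = torch.tensor([[1, 2], [0, 0]])  # entities 1 and 2 point to sentence node 0
    edge_type = torch.tensor([0, 1])
    print(layer(h, edge_index, edge_type).shape)  # torch.Size([3, 8])
```

In a full model, the node states would presumably come from a pre-trained language model for the sentence node and from knowledge-graph embeddings for the entity nodes, which is why the abstract introduces auxiliary tasks to align these heterogeneous embedding spaces.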