Sentence-graph-level knowledge injection with multi-task learning

Cited by: 0
Authors
Chen, Liyi [1 ]
Wang, Runze [2 ]
Shi, Chen [2 ]
Yuan, Yifei [3 ]
Liu, Jie [1 ]
Hu, Yuxiang [2 ]
Jiang, Feijun [2 ]
Affiliations
[1] Nankai Univ, Coll Artificial Intelligence, Tianjin, Peoples R China
[2] Alibaba Grp, Hangzhou, Peoples R China
[3] Chinese Univ Hong Kong, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Language representation learning; Knowledge graph; Knowledge injection; Multi-task learning;
DOI
10.1007/s11280-025-01329-z
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Language representation learning is a fundamental task in natural language understanding. It aims to represent natural language sentences and classify the entities and relations they mention, which usually requires injecting external entity and relation knowledge into the sentence representation. Existing methods typically inject factual knowledge into pre-trained language models by sequentially concatenating it after the sentence, paying less attention to the structured information in the knowledge graph and the interactions within it. In this paper, we learn sentence representations from both sentence- and graph-level knowledge at the fine-tuning stage with a multi-task learning framework (SenGraph). At the sentence level, we concatenate factual knowledge with the sentence in a sequential structure and train with a sentence-level task. At the graph level, we organize all the knowledge and sentence information into a graph and introduce a relational GAT to selectively inject useful knowledge into the sentence. Meanwhile, we design two graph-based auxiliary tasks to align the heterogeneous embedding spaces of the natural language sentence and the knowledge graph. We evaluate our model on four knowledge-driven benchmark datasets, and the experimental results demonstrate the effectiveness of the proposed method while using fewer computational resources.
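The abstract describes the architecture only at a high level. The sketch below is our own minimal PyTorch illustration of the graph-level branch under stated assumptions, not the authors' released code: the relation-aware attention layer, the head names (cls_head, link_head, align_head), and the 0.5 auxiliary-loss weights are hypothetical, and the paper's actual relational GAT and auxiliary tasks may differ in detail.

import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationalGATLayer(nn.Module):
    """Relation-aware graph attention: each edge's attention score is
    conditioned on an embedding of its relation type."""
    def __init__(self, dim, num_relations):
        super().__init__()
        self.rel_emb = nn.Embedding(num_relations, dim)
        self.proj = nn.Linear(dim, dim, bias=False)
        self.att = nn.Linear(3 * dim, 1)  # scores [source; relation; target]

    def forward(self, h, edge_index, edge_type):
        src, dst = edge_index                          # each of shape (E,)
        hs, hd = self.proj(h[src]), self.proj(h[dst])
        rel = self.rel_emb(edge_type)                  # (E, dim)
        score = F.leaky_relu(
            self.att(torch.cat([hs, rel, hd], dim=-1))).squeeze(-1)
        # softmax over the incoming edges of each destination node
        exp = (score - score.max()).exp()
        denom = torch.zeros(h.size(0), device=h.device).index_add_(0, dst, exp)
        alpha = exp / denom[dst].clamp(min=1e-12)
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * hs)
        return F.elu(out + h)                          # residual keeps the original signal

class SenGraphSketch(nn.Module):
    """Graph-level branch with one main and two auxiliary heads. Node features
    are assumed to come from a pre-trained sentence encoder computed upstream."""
    def __init__(self, dim, num_relations, num_labels):
        super().__init__()
        self.rgat = RelationalGATLayer(dim, num_relations)
        self.cls_head = nn.Linear(dim, num_labels)             # main sentence-level task
        self.link_head = nn.Bilinear(dim, dim, num_relations)  # auxiliary: relation prediction
        self.align_head = nn.Linear(dim, dim)                  # auxiliary: embedding-space alignment

    def forward(self, node_feats, edge_index, edge_type, sent_idx):
        h = self.rgat(node_feats, edge_index, edge_type)
        return self.cls_head(h[sent_idx]), h

def multi_task_loss(main_loss, link_loss, align_loss, w1=0.5, w2=0.5):
    # placeholder weights; the paper's weighting scheme is not given here
    return main_loss + w1 * link_loss + w2 * align_loss

# toy usage: one sentence node (index 0) and five knowledge nodes
dim, R, C = 128, 5, 3
model = SenGraphSketch(dim, R, C)
feats = torch.randn(6, dim)
edges = torch.tensor([[1, 2, 3, 4, 5],   # knowledge nodes ...
                      [0, 0, 0, 0, 0]])  # ... all point at the sentence node
etypes = torch.randint(0, R, (5,))
logits, h = model(feats, edges, etypes, sent_idx=0)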
Pages: 20
Related Papers
50 items in total
  • [31] Open knowledge base canonicalization with multi-task learning
    Liu, Bingchen
    Peng, Huang
    Zeng, Weixin
    Zhao, Xiang
    Liu, Shijun
    Pan, Li
    Li, Xin
    WORLD WIDE WEB-INTERNET AND WEB INFORMATION SYSTEMS, 2024, 27 (05):
  • [33] Multi-task gradient descent for multi-task learning
    Bai, Lu
    Ong, Yew-Soon
    He, Tiantian
    Gupta, Abhishek
    MEMETIC COMPUTING, 2020, 12 (04) : 355 - 369
  • [34] Multi-Task Learning with Knowledge Distillation for Dense Prediction
    Xu, Yangyang
    Yang, Yibo
    Zhang, Lefei
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 21493 - 21502
  • [35] Knowledge triple mining via multi-task learning
    Zhang, Zhao
    Zhuang, Fuzhen
    Li, Xuebing
    Niu, Zheng-Yu
    He, Jia
    He, Qing
    Xiong, Hui
    INFORMATION SYSTEMS, 2019, 80 : 64 - 75
  • [36] Multi-task heterogeneous graph learning on electronic health records
    Chan, Tsai Hor
    Yin, Guosheng
    Bae, Kyongtae
    Yu, Lequan
    NEURAL NETWORKS, 2024, 180
  • [37] Association Graph Learning for Multi-Task Classification with Category Shifts
    Shen, Jiayi
    Xiao, Zehao
    Zhen, Xiantong
    Snoek, Cees G. M.
    Worring, Marcel
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [38] A Multi-Task Representation Learning Architecture for Enhanced Graph Classification
    Xie, Yu
    Gong, Maoguo
    Gao, Yuan
    Qin, A. K.
    Fan, Xiaolong
    FRONTIERS IN NEUROSCIENCE, 2020, 13
  • [39] Adaptive dual graph regularization for clustered multi-task learning
    Liu, Cheng
    Li, Rui
    Chen, Sentao
    Zheng, Lin
    Jiang, Dazhi
    NEUROCOMPUTING, 2024, 574
  • [40] USR-MTL: an unsupervised sentence representation learning framework with multi-task learning
    Xu, Wenshen
    Li, Shuangyin
    Lu, Yonghe
    APPLIED INTELLIGENCE, 2021, 51 : 3506 - 3521