Probing Linguistic Features of Sentence-Level Representations in Neural Relation Extraction

Cited: 0
Authors
Alt, Christoph [1]
Gabryszak, Aleksandra [1]
Hennig, Leonhard [1]
Affiliation
[1] German Res Ctr Artificial Intelligence DFKI, Speech & Language Technol Lab, Kaiserslautern, Germany
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Despite the recent progress, little is known about the features captured by state-of-the-art neural relation extraction (RE) models. Common methods encode the source sentence, conditioned on the entity mentions, before classifying the relation. However, the complexity of the task makes it difficult to understand how encoder architecture and supporting linguistic knowledge affect the features learned by the encoder. We introduce 14 probing tasks targeting linguistic properties relevant to RE, and we use them to study representations learned by more than 40 different encoder architecture and linguistic feature combinations trained on two datasets, TACRED and SemEval 2010 Task 8. We find that the bias induced by the architecture and the inclusion of linguistic features are clearly expressed in the probing task performance. For example, adding contextualized word representations greatly increases performance on probing tasks with a focus on named entity and part-of-speech information, and yields better results in RE. In contrast, entity masking improves RE, but considerably lowers performance on entity type related probing tasks.
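The probing methodology the abstract describes can be illustrated with a minimal sketch: representations from a frozen encoder are fed to a small diagnostic classifier, and above-chance accuracy on a linguistic property (e.g. an entity-type label) suggests that property is encoded. The data, the signal direction, and the logistic-regression probe below are all illustrative assumptions, not the paper's actual setup or datasets.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical frozen sentence representations (stand-ins for an RE
# encoder's outputs). We plant a binary linguistic property (say, a
# head-entity type) along one direction so the probe has signal to find.
n, d = 600, 32
labels = rng.integers(0, 2, size=n)       # property being probed for
reps = rng.normal(size=(n, d))
reps[:, 0] += 2.0 * labels                # property is linearly encoded

# A logistic-regression probe trained by plain gradient descent;
# the encoder itself stays fixed, only the probe's weights are learned.
w, b = np.zeros(d), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(reps @ w + b)))
    w -= 0.5 * (reps.T @ (p - labels)) / n
    b -= 0.5 * float(np.mean(p - labels))

acc = float(np.mean(((reps @ w + b) > 0) == labels))
print(f"probe accuracy: {acc:.2f}")       # well above the 0.5 chance level
```

In a real study the probe would be trained and evaluated on held-out representations from the actual encoder, and compared against a random-encoder baseline to separate what the encoder learned from what the probe can memorize.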
Pages: 1534-1545
Page count: 12