Exploring DBpedia and Wikipedia for Portuguese Semantic Relationship Extraction

Cited: 0
Authors
Batista, David S. [1 ,2 ]
Forte, David [1 ,2 ]
Silva, Rui [1 ,2 ]
Martins, Bruno [1 ,2 ]
Silva, Mario J. [1 ,2 ]
Affiliations
[1] Inst Super Tecn, Lisbon, Portugal
[2] INESC ID, Lisbon, Portugal
Source
LINGUAMATICA | 2013, Vol. 5, Issue 01
Keywords
Relation Extraction; Information Extraction;
DOI
None available
Chinese Library Classification (CLC)
H0 [Linguistics];
Subject Classification Codes
030303 ; 0501 ; 050102 ;
Abstract
The identification of semantic relationships, as expressed between named entities in text, is an important step for extracting knowledge from large document collections, such as the Web. Previous work has addressed this task for the English language through supervised learning techniques for automatic classification. The current state of the art involves the use of learning methods based on string kernels (Kim et al., 2010; Zhao and Grishman, 2005). However, such approaches require manually annotated training data for each type of semantic relationship, and have scalability problems when tens or hundreds of different types of relationships have to be extracted. This article discusses an approach for distantly supervised relation extraction over texts written in the Portuguese language, which uses an efficient technique for measuring similarity between relation instances, based on minwise hashing (Broder, 1997) and on locality-sensitive hashing (Rajaraman and Ullman, 2011). In the proposed method, the training examples are automatically collected from Wikipedia, corresponding to sentences that express semantic relationships between pairs of entities extracted from DBpedia. These examples are represented as sets of character quadgrams and other representative elements. The sets are indexed in a data structure that implements the idea of locality-sensitive hashing. To check which semantic relationship is expressed between a given pair of entities referenced in a sentence, the most similar training examples are searched, based on an approximation to the Jaccard coefficient, obtained through min-hashing. The relation class is assigned based on the weighted votes of the most similar examples.
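As a rough illustration of the similarity technique described above (not the authors' actual implementation; function names and the 64-hash signature size are illustrative choices), sentences can be turned into sets of character quadgrams and compared via min-hashing, whose signature agreement approximates the Jaccard coefficient:

```python
import hashlib

def quadgrams(text):
    """Represent a sentence as its set of character 4-grams (shingles)."""
    return {text[i:i + 4] for i in range(len(text) - 3)}

def minhash_signature(shingles, num_hashes=64):
    """Minwise hashing: for each of num_hashes seeded hash functions,
    keep the minimum hash value observed over the shingle set."""
    sig = []
    for seed in range(num_hashes):
        sig.append(min(
            int.from_bytes(hashlib.md5(f"{seed}:{s}".encode()).digest()[:8], "big")
            for s in shingles))
    return sig

def estimated_jaccard(sig_a, sig_b):
    """The fraction of positions where two signatures agree is an
    unbiased estimate of the Jaccard coefficient of the shingle sets."""
    return sum(x == y for x, y in zip(sig_a, sig_b)) / len(sig_a)
```

With enough hash functions, two near-duplicate Portuguese sentences yield a much higher estimate than two unrelated ones, without ever computing the exact set intersection.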
Tests with a dataset from Wikipedia validate the suitability of the proposed method, showing, for instance, that the method is able to extract 10 different types of semantic relations, 8 of them corresponding to asymmetric relations, with an average F1 score of 55.6%.
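The locality-sensitive indexing and weighted-vote classification described in the abstract can be sketched with the standard banding technique over minhash signatures (a minimal sketch under assumed parameters — class name, band count, and rows per band are hypothetical, not taken from the paper):

```python
from collections import defaultdict

class LSHIndex:
    """LSH via banding: each signature is split into bands of consecutive
    positions; two examples become candidate neighbours if they collide
    in at least one band bucket."""

    def __init__(self, num_bands=16, rows_per_band=4):
        self.num_bands = num_bands
        self.rows = rows_per_band
        self.buckets = [defaultdict(list) for _ in range(num_bands)]
        self.examples = []  # list of (signature, relation_label)

    def add(self, signature, label):
        """Index a training example (e.g. a DBpedia-derived sentence)."""
        idx = len(self.examples)
        self.examples.append((signature, label))
        for b in range(self.num_bands):
            band = tuple(signature[b * self.rows:(b + 1) * self.rows])
            self.buckets[b][band].append(idx)

    def classify(self, signature):
        """Gather candidates colliding in any band, then assign the
        relation class by similarity-weighted votes."""
        candidates = set()
        for b in range(self.num_bands):
            band = tuple(signature[b * self.rows:(b + 1) * self.rows])
            candidates.update(self.buckets[b].get(band, []))
        votes = defaultdict(float)
        for i in candidates:
            sig, label = self.examples[i]
            sim = sum(x == y for x, y in zip(sig, signature)) / len(signature)
            votes[label] += sim
        return max(votes, key=votes.get) if votes else None
```

Only examples sharing a bucket with the query are ever compared, which is what makes the approach scale to many relation types and large training sets.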
Pages: 41-57
Page count: 17
Related Papers
50 items in total
  • [1] DBpedia and the live extraction of structured data from Wikipedia
    Morsey, Mohamed
    Lehmann, Jens
    Auer, Soeren
    Stadler, Claus
    Hellmann, Sebastian
    PROGRAM-ELECTRONIC LIBRARY AND INFORMATION SYSTEMS, 2012, 46 (02) : 157 - 181
  • [2] Wikipedia and DBpedia for Media - Managing Audiovisual Resources in Their Semantic Context
    Evain, Jean-Pierre
    Matton, Mike
    Vaervagen, Tormod
    KNOWLEDGE GRAPHS AND LANGUAGE TECHNOLOGY, 2017, 10579 : 41 - 56
  • [3] Exploring the Geospatial Semantic Web with DBpedia Mobile
    Becker, Christian
    Bizer, Christian
    JOURNAL OF WEB SEMANTICS, 2009, 7 (04): : 278 - 286
  • [4] Wikipedia editing history in DBpedia
    Gandon, Fabien
    Boyer, Raphael
    Corby, Olivier
    Monnin, Alexandre
    2016 IEEE/WIC/ACM INTERNATIONAL CONFERENCE ON WEB INTELLIGENCE (WI 2016), 2016, : 479 - 482
  • [5] Information extraction and semantic annotation of wikipedia
    [authors not listed]
    Computer Science Department, Universidad Autonoma de Madrid, Spain
    Front. Artif. Intell. Appl., 2008, 1: 145 - 169
  • [6] SLHCat: Mapping Wikipedia Categories and Lists to DBpedia by Leveraging Semantic, Lexical, and Hierarchical Features
    Wang, Zhaoyi
    Zhang, Zhenyang
    Qin, Jiaxin
    Iwaihara, Mizuho
    LEVERAGING GENERATIVE INTELLIGENCE IN DIGITAL LIBRARIES: TOWARDS HUMAN-MACHINE COLLABORATION, ICADL 2023, PT I, 2023, 14457 : 133 - 148
  • [7] Automatic Extraction of Semantic Relations from Wikipedia
    Arnold, Patrick
    Rahm, Erhard
    INTERNATIONAL JOURNAL ON ARTIFICIAL INTELLIGENCE TOOLS, 2015, 24 (02)
  • [8] Semantic Sense Extraction From Wikipedia Pages
    Pirrone, Roberto
    Pipitone, Arianna
    Russo, Giuseppe
    3RD INTERNATIONAL CONFERENCE ON HUMAN SYSTEM INTERACTION, 2010, : 543 - 547
  • [9] Updating Wikipedia via DBpedia Mappings and SPARQL
    Ahmeti, Albin
    Fernandez, Javier D.
    Polleres, Axel
    Savenkov, Vadim
    SEMANTIC WEB ( ESWC 2017), PT I, 2017, 10249 : 485 - 501
  • [10] Building The Indonesian NE Dataset Using Wikipedia and DBpedia with Entities Expansion Method on DBpedia
    Alfarohmi, Haji Dito Murya
    Bijaksana, Moch. Arif
    2018 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP), 2018, : 334 - 339