Representation of Relations by Planes in Neural Network Language Model

Cited by: 0
Authors
Ebisu, Takuma [1,2]
Ichise, Ryutaro [1,2,3]
Affiliations
[1] SOKENDAI Grad Univ Adv Studies, Tokyo, Japan
[2] Natl Inst Informat, Tokyo, Japan
[3] Natl Inst Adv Ind Sci & Technol, Tokyo, Japan
DOI: 10.1007/978-3-319-46687-3_33
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Whole brain architecture (WBA), which uses neural networks to imitate the human brain, is attracting increased attention as a promising way to achieve artificial general intelligence, and distributed vector representations of words are becoming recognized as the best way to connect neural networks and knowledge. Distributed representations of words have played a wide range of roles in natural language processing, and they have become increasingly important because of their ability to capture a large amount of syntactic and lexical meaning and relationships. Relation vectors are used to represent relations between words, but this approach has some problems: some relations, such as sibling relations, parent-child relations, and many-to-one relations, cannot be easily defined. To deal with these problems, we propose a novel way of representing relations: relations are represented by planes instead of by vectors, which increases the accuracy of relation prediction by more than 10%.
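As a rough, hypothetical sketch of the contrast the abstract draws (not the authors' exact formulation; all embedding values below are made up), the following Python snippet compares the usual relation-vector view with a plane fitted to the difference vectors of known word pairs, where a candidate pair is scored by its distance to the plane rather than by a single offset.

import numpy as np

# Toy 3-dimensional word embeddings (hypothetical values, for illustration only).
emb = {
    "man":   np.array([0.9, 0.1, 0.2]),
    "woman": np.array([0.8, 0.9, 0.1]),
    "king":  np.array([0.7, 0.2, 0.9]),
    "queen": np.array([0.6, 0.95, 0.85]),
}

# Relation-vector view: one offset represents the relation and is applied by addition.
rel_vec = emb["woman"] - emb["man"]
predicted_queen = emb["king"] + rel_vec

# Plane view (sketched loosely): fit a plane n . x + b = 0 that the difference
# vectors of known pairs lie close to, then score a candidate pair by its
# distance to the plane (smaller distance = better fit to the relation).
pair_diffs = np.stack([
    emb["woman"] - emb["man"],
    emb["queen"] - emb["king"],
])
centroid = pair_diffs.mean(axis=0)
_, _, vt = np.linalg.svd(pair_diffs - centroid)  # last row: least-variance direction
normal = vt[-1]
bias = -normal @ centroid

def plane_distance(diff):
    """Distance of a pair's difference vector from the relation plane."""
    return abs(normal @ diff + bias)

print("relation-vector prediction for 'queen':", predicted_queen)
print("plane distance for (man, woman): ", plane_distance(emb["woman"] - emb["man"]))
print("plane distance for (king, queen):", plane_distance(emb["queen"] - emb["king"]))

With only two example pairs the plane is underdetermined; the sketch is meant only to show the shift from a single offset vector to a distance-to-plane score, which is what allows one plane to cover pairs (for example, many-to-one pairs) that no single offset vector can fit.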
Pages: 300-307
Number of pages: 8