Tree-Structured Neural Networks With Topic Attention for Social Emotion Classification

Cited by: 6
Authors
Wang, Chang [1 ]
Wang, Bang [1 ]
Xu, Minghua [2 ]
Affiliations
[1] HUST, Sch Elect Informat & Commun, Wuhan 430074, Hubei, Peoples R China
[2] HUST, Sch Journalism & Informat Commun, Wuhan 430074, Hubei, Peoples R China
Source
IEEE ACCESS, 2019, Vol. 7
Keywords
Long short-term memory; social emotion classification; topic attention mechanism; topic model; tree-structured neural network
DOI
10.1109/ACCESS.2019.2929204
CLC Number
TP [Automation & Computer Technology]
Discipline Code
0812
Abstract
Social emotion classification studies the emotion distribution that an article evokes among its many readers. Although recent neural network-based methods improve classification performance over earlier word-emotion and topic-emotion approaches, they have not fully exploited important sentence-level language features and document topic features. In this paper, we propose a new neural network architecture that exploits both the syntactic information of a sentence and the topic distribution of a document. The proposed architecture first constructs a tree-structured long short-term memory (Tree-LSTM) network over the sentence's syntactic dependency tree to obtain a sentence vector representation. For a multi-sentence document, we then use a Chain-LSTM network to obtain the document representation from its sentences' hidden states. Furthermore, we design a topic-based attention mechanism with two attention levels: word-level attention weights the words of a single-sentence document, and sentence-level attention weights the sentences of a multi-sentence document. Experiments on three public datasets show that the proposed scheme outperforms state-of-the-art methods in terms of average Pearson correlation coefficient and MicroF1.
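The topic-based attention the abstract describes can be sketched compactly. Below is a minimal, hypothetical PyTorch sketch of the word-level variant: LSTM hidden states are scored against a document topic vector (e.g., an LDA topic distribution) and pooled by the resulting weights. All class, parameter, and dimension names here are illustrative assumptions, not the authors' released implementation; the sentence-level attention would apply the same scoring over sentence representations instead of word states.

```python
# Hypothetical sketch of topic-driven word-level attention (not the paper's code).
import torch
import torch.nn as nn


class TopicAttention(nn.Module):
    """Pools hidden states by their relevance to a document topic vector."""

    def __init__(self, hidden_dim: int, topic_dim: int, attn_dim: int = 64):
        super().__init__()
        self.proj_h = nn.Linear(hidden_dim, attn_dim)  # project hidden states
        self.proj_t = nn.Linear(topic_dim, attn_dim)   # project topic vector
        self.score = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, h: torch.Tensor, topic: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden_dim) word (or sentence) hidden states
        # topic: (batch, topic_dim) document topic distribution, e.g. from LDA
        u = torch.tanh(self.proj_h(h) + self.proj_t(topic).unsqueeze(1))
        alpha = torch.softmax(self.score(u).squeeze(-1), dim=-1)  # (batch, seq_len)
        return torch.bmm(alpha.unsqueeze(1), h).squeeze(1)        # weighted sum


# Toy usage: 2 documents, 5 tokens each, 128-dim states, 50 topics (all assumed).
attn = TopicAttention(hidden_dim=128, topic_dim=50)
pooled = attn(torch.randn(2, 5, 128), torch.softmax(torch.randn(2, 50), dim=-1))
print(pooled.shape)  # torch.Size([2, 128])
```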
Pages: 95505-95515
Page count: 11
Related Papers (50 records)
  • [1] Tree-Structured Neural Topic Model
    Isonuma, Masaru
    Mori, Junichiro
    Bollegala, Danushka
    Sakata, Ichiro
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 800 - 806
  • [2] Tree-Structured Binary Neural Networks
    Serbetci, Ayse
    Akgul, Yusuf Sinan
    29TH IEEE CONFERENCE ON SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS (SIU 2021), 2021,
  • [3] Tree-structured multilayer neural network for classification
    Yang, Shiueng-Bien
    Neural Computing and Applications, 2020, 32 : 5859 - 5873
  • [4] Tree-structured multilayer neural network for classification
    Yang, Shiueng-Bien
    NEURAL COMPUTING & APPLICATIONS, 2020, 32 (10): 5859 - 5873
  • [5] Tree-Structured Topic Modeling with Nonparametric Neural Variational Inference
    Chen, Ziye
    Ding, Cheng
    Zhang, Zusheng
    Rao, Yanghui
    Xie, Haoran
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021, : 2343 - 2353
  • [6] Classification of Surface Defects on Cold Rolled Strip by Tree-Structured Neural Networks
    Moon, Chang In
    Choi, Se Ho
    Joo, Won Jong
    Kim, Gi Bum
    TRANSACTIONS OF THE KOREAN SOCIETY OF MECHANICAL ENGINEERS A, 2007, 31 (06) : 651 - 658
  • [7] An Attention-based Rumor Detection Model with Tree-structured Recursive Neural Networks
    Ma, Jing
    Gao, Wei
    Joty, Shafiq
    Wong, Kam-Fai
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2020, 11 (04)
  • [8] RST Discourse Parsing with Tree-Structured Neural Networks
    Zhang, Longyin
    Sun, Cheng
    Tan, Xin
    Kong, Fang
    MACHINE TRANSLATION, CWMT 2018, 2019, 954 : 15 - 26
  • [9] Shortcomings with tree-structured edge encodings for neural networks
    Hornby, GS
    GENETIC AND EVOLUTIONARY COMPUTATION GECCO 2004, PT 2, PROCEEDINGS, 2004, 3103 : 495 - 506
  • [10] Speech Emotion Classification Using Tree-Structured Sparse Logistic Regression
    Kim, Myung Jong
    Yoo, Joohong
    Kim, Younggwan
    Kim, Hoirin
    16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 1541 - 1545