Multi-head attention model for aspect level sentiment analysis

Cited by: 8
|
Authors
Zhang, Xinsheng [1]
Gao, Teng [1]
Affiliations
[1] Xian Univ Architecture & Technol, Sch Management, Xian, Shaanxi, Peoples R China
Keywords
Text sentiment classification; fine-grained sentiment analysis; attention mechanism
DOI
10.3233/JIFS-179383
CLC Number
TP18 [Artificial intelligence theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The aspect-level sentiment classification task requires topical polarity classification with respect to each described aspect. The same word can be polysemous, and its emotional polarity can differ across target objects. Word embeddings capture semantic information but cannot adapt to polysemy. Attention mechanisms have achieved good performance on these tasks; however, they only capture the degree of association between words and cannot provide detailed descriptions. In this paper, the ELMo model is used to resolve word polysemy, and the Transformer model is used to extract the features most relevant to the target object for emotional polarity classification. Our contribution is to overcome polysemy interference and to use the attention mechanism to model the network of relationships between words, so that the model can extract important classification features according to different target words. Experiments on the laptop and restaurant datasets demonstrate that our approach achieves new state-of-the-art performance on several benchmarks.
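The multi-head attention the abstract refers to can be illustrated with a minimal NumPy sketch of standard scaled dot-product attention over token embeddings (such as ELMo outputs). This is a generic illustration of the mechanism, not the authors' trained model: the projection matrices here are random placeholders for learned weights, and the dimensions are arbitrary.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    """x: (seq_len, d_model) token embeddings, e.g. contextual ELMo vectors."""
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    # Random projections stand in for the learned Q/K/V/output weights.
    w_q, w_k, w_v, w_o = (rng.standard_normal((d_model, d_model)) / np.sqrt(d_model)
                          for _ in range(4))
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Split into heads: (num_heads, seq_len, d_k)
    split = lambda t: t.reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)
    q, k, v = split(q), split(k), split(v)
    # Scaled dot-product attention, computed independently per head.
    scores = q @ k.transpose(0, 2, 1) / np.sqrt(d_k)   # (heads, seq, seq)
    weights = softmax(scores, axis=-1)                 # rows sum to 1
    heads = weights @ v                                # (heads, seq, d_k)
    # Concatenate heads and project back to d_model.
    out = heads.transpose(1, 0, 2).reshape(seq_len, d_model) @ w_o
    return out, weights

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))              # 5 tokens, d_model = 8
out, attn = multi_head_attention(x, num_heads=2, rng=rng)
print(out.shape, attn.shape)                 # (5, 8) (2, 5, 5)
```

Each head attends over the full sequence with its own learned projection, which is what lets the model weight different context words differently for different target aspects.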
Pages: 89 - 96 (8 pages)
Related Papers
50 records
  • [31] Multi-head self-attention based gated graph convolutional networks for aspect-based sentiment classification
    Xiao, Luwei
    Hu, Xiaohui
    Chen, Yinong
    Xue, Yun
    Chen, Bingliang
    Gu, Donghong
    Tang, Bixia
    MULTIMEDIA TOOLS AND APPLICATIONS, 2022, 81 (14) : 19051 - 19070
  • [33] Microblog Sentiment Analysis Based on Dynamic Character-Level and Word-Level Features and Multi-Head Self-Attention Pooling
    Yan, Shangyi
    Wang, Jingya
    Song, Zhiqiang
    FUTURE INTERNET, 2022, 14 (08):
  • [34] Bidirectional Encoder Representations from Transformers (BERT) and Serialized Multi-Layer Multi-Head Attention Feature Location Model for Aspect-Level Sentiment Analysis
    Regina, I. Anette
    Sengottuvelan, P.
    JOURNAL OF ALGEBRAIC STATISTICS, 2022, 13 (02) : 1391 - 1406
  • [35] On the diversity of multi-head attention
    Li, Jian
    Wang, Xing
    Tu, Zhaopeng
    Lyu, Michael R.
    NEUROCOMPUTING, 2021, 454 : 14 - 24
  • [36] A Multi-Hop Attention Deep Model for Aspect-Level Sentiment Classification
    Deng Y.
    Lei H.
    Li X.-Y.
    Lin Y.-O.
    Dianzi Keji Daxue Xuebao/Journal of the University of Electronic Science and Technology of China, 2019, 48 (05) : 759 - 766
  • [37] Target-Based Attention Model for Aspect-Level Sentiment Analysis
    Chen, Wei
    Yu, Wenxin
    Zhang, Zhiqiang
    Zhang, Yunye
    Xu, Kepeng
    Zhang, Fengwei
    Fan, Yibo
    He, Gang
    Yang, Zhuo
    NEURAL INFORMATION PROCESSING (ICONIP 2019), PT III, 2019, 11955 : 259 - 269
  • [38] An Interactive Graph Attention Networks Model for Aspect-level Sentiment Analysis
    Han Hu
    Wu Yuanhang
    Qin Xiaoya
    JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY, 2021, 43 (11) : 3282 - 3290
  • [39] Multi-Attention Network for Aspect Sentiment Analysis
    Han, Huiyu
    Li, Xiaoge
    Zhi, Shuting
    Wang, Haoyue
    2019 8TH INTERNATIONAL CONFERENCE ON SOFTWARE AND COMPUTER APPLICATIONS (ICSCA 2019), 2019, : 22 - 26
  • [40] Automatic scene generation using sentiment analysis and bidirectional recurrent neural network with multi-head attention
    Dharaniya, R.
    Indumathi, J.
    Uma, G. V.
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (19) : 16945 - 16958