Multi-head attention model for aspect level sentiment analysis

Cited by: 8
Authors
Zhang, Xinsheng [1 ]
Gao, Teng [1 ]
Affiliations
[1] Xian Univ Architecture & Technol, Sch Management, Xian, Shaanxi, Peoples R China
Keywords
Text sentiment classification; fine-grained sentiment analysis; attention mechanism;
DOI
10.3233/JIFS-179383
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Aspect-level sentiment classification requires predicting sentiment polarity with respect to each aspect described in a sentence. The same word can be polysemous, and its emotional polarity can differ depending on the target object. Static word embeddings capture semantic information but cannot adapt to polysemy. Attention mechanisms have performed well on this task; however, they capture only the degree of association between words, not detailed descriptions. In this paper, the ELMo model is used to resolve word polysemy, and the Transformer model is used to extract the features most relevant to the target object for sentiment polarity classification. Our contribution is to overcome polysemy interference and to use the attention mechanism to model the network of relationships between words, so that the model can extract the important classification features for each target word. Experiments on the laptop and restaurant datasets demonstrate that our approach achieves new state-of-the-art performance on several benchmarks.
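The core mechanism the abstract describes, an aspect term attending over contextual token embeddings with multiple heads to pick out target-specific features, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the contextual embeddings are stand-ins for ELMo outputs, and the random projection matrices stand in for learned Transformer weights.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(query, context, num_heads, rng):
    """Scaled dot-product attention of an aspect query over sentence tokens.

    query:   (d,)   embedding of the aspect (target) term
    context: (n, d) contextual embeddings of the n sentence tokens
    Returns an aspect-specific sentence representation (d,) and the
    per-head attention weights (num_heads, n).
    """
    d = query.shape[0]
    assert d % num_heads == 0
    d_h = d // num_heads
    # random projections stand in for learned weights in this sketch
    Wq = rng.standard_normal((d, d)) / np.sqrt(d)
    Wk = rng.standard_normal((d, d)) / np.sqrt(d)
    Wv = rng.standard_normal((d, d)) / np.sqrt(d)
    q = (query @ Wq).reshape(num_heads, d_h)           # (h, d_h)
    k = (context @ Wk).reshape(-1, num_heads, d_h)     # (n, h, d_h)
    v = (context @ Wv).reshape(-1, num_heads, d_h)     # (n, h, d_h)
    scores = np.einsum('hd,nhd->hn', q, k) / np.sqrt(d_h)
    weights = softmax(scores, axis=-1)                  # one distribution per head
    heads = np.einsum('hn,nhd->hd', weights, v)         # (h, d_h)
    return heads.reshape(d), weights

rng = np.random.default_rng(0)
tokens = rng.standard_normal((6, 8))   # 6 tokens, d=8 (toy ELMo-style output)
aspect = rng.standard_normal(8)        # embedding of the aspect term
rep, attn = multi_head_attention(aspect, tokens, num_heads=2, rng=rng)
```

Because the query is the aspect embedding rather than the tokens themselves, each aspect in the same sentence yields a different weighting over tokens, which is how the model separates, e.g., polarity toward "food" from polarity toward "service" in a restaurant review.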
Pages: 89 - 96 (8 pages)
Related Papers
50 records in total
  • [21] Attention-Enhanced Graph Convolutional Networks for Aspect-Based Sentiment Classification with Multi-Head Attention
    Xu, Guangtao
    Liu, Peiyu
    Zhu, Zhenfang
    Liu, Jie
    Xu, Fuyong
    APPLIED SCIENCES-BASEL, 2021, 11 (08):
  • [22] Affective-Knowledge-Enhanced Graph Convolutional Networks for Aspect-Based Sentiment Analysis with Multi-Head Attention
    Cui, Xiaodong
    Tao, Wenbiao
    Cui, Xiaohui
    APPLIED SCIENCES-BASEL, 2023, 13 (07):
  • [23] Video sentiment analysis with bimodal information-augmented multi-head attention
    Wu, Ting
    Peng, Junjie
    Zhang, Wenqiang
    Zhang, Huiran
    Tan, Shuhua
    Yi, Fen
    Ma, Chuanshuai
    Huang, Yansong
    KNOWLEDGE-BASED SYSTEMS, 2022, 235
  • [24] Sentiment Analysis with An Integrated Model of BERT and Bi-LSTM Based on Multi-Head Attention Mechanism
    Wang, Yahui
    Cheng, Xiaoqing
    Meng, Xuelei
    IAENG International Journal of Computer Science, 2023, 50 (01)
  • [25] A Multi-Attention Network for Aspect-Level Sentiment Analysis
    Zhang, Qiuyue
    Lu, Ran
    FUTURE INTERNET, 2019, 11 (07):
  • [26] Position-Enhanced Multi-Head Self-Attention Based Bidirectional Gated Recurrent Unit for Aspect-Level Sentiment Classification
    Li, Xianyong
    Ding, Li
    Du, Yajun
    Fan, Yongquan
    Shen, Fashan
    FRONTIERS IN PSYCHOLOGY, 2022, 12
  • [27] Diversifying Multi-Head Attention in the Transformer Model
    Ampazis, Nicholas
    Sakketou, Flora
    MACHINE LEARNING AND KNOWLEDGE EXTRACTION, 2024, 6 (04): : 2618 - 2638
  • [28] Multi-Task Multi-Head Attention Memory Network for Fine-Grained Sentiment Analysis
    Dai, Zehui
    Dai, Wei
    Liu, Zhenhua
    Rao, Fengyun
    Chen, Huajie
    Zhang, Guangpeng
    Ding, Yadong
    Liu, Jiyang
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING (NLPCC 2019), PT I, 2019, 11838 : 609 - 620
  • [29] Sentiment Analysis Using Multi-Head Attention Capsules With Multi-Channel CNN and Bidirectional GRU
    Cheng, Yan
    Sun, Huan
    Chen, Haomai
    Li, Meng
    Cai, Yingying
    Cai, Zhuang
    Huang, Jing
    IEEE ACCESS, 2021, 9 : 60383 - 60395
  • [30] Short Text Sentiment Analysis Based on Multi-Channel CNN With Multi-Head Attention Mechanism
    Feng, Yue
    Cheng, Yan
    IEEE ACCESS, 2021, 9 : 19854 - 19863