Emoji multimodal microblog sentiment analysis based on mutual attention mechanism

Cited by: 1
Authors
Lou, Yinxia [1 ]
Zhou, Junxiang [2 ]
Zhou, Jun [3 ]
Ji, Donghong [3 ]
Zhang, Qing [4 ]
Affiliations
[1] Jianghan Univ, Sch Artificial Intelligence, Wuhan 430056, Peoples R China
[2] Shangqiu Normal Univ, Sch Informat Technol, Shangqiu 476000, Peoples R China
[3] Wuhan Univ, Sch Cyber Sci & Engn, Key Lab Aerosp Informat Secur & Trusted Comp, Minist Educ, Wuhan 430072, Peoples R China
[4] North China DEAN Power Engn Beijing Co Ltd, Beijing 100120, Peoples R China
Source
SCIENTIFIC REPORTS | 2024, Vol. 14, No. 1
Keywords
Emoji; Mutual attention mechanism; Multimodal sentiment analysis; Multimodal fusion;
DOI
10.1038/s41598-024-80167-x
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biosciences]; N [General Natural Science];
Subject Classification Codes
07; 0710; 09
Abstract
Emojis visually mimic human facial expressions and postures to convey emotions and opinions. They are widely used on social media platforms such as Sina Weibo and have become a crucial feature for sentiment analysis. However, existing approaches often treat emojis as special symbols or convert them into text labels, thereby neglecting their rich visual information. We propose a novel multimodal information integration model for emoji microblog sentiment analysis. To effectively leverage the emoji visual information, the model employs a text-emoji visual mutual attention mechanism. Experiments on a manually annotated microblog dataset show that, compared to baseline models that do not incorporate emoji visual information, the proposed model achieves improvements of 1.37% in macro F1 score and 2.30% in accuracy. To facilitate related research, our corpus will be publicly available at https://github.com/yx100/Emojis/blob/main/weibo-emojis-annotation.
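The paper's exact architecture is not reproduced in this record. As a minimal illustrative sketch of the general idea of a text-emoji mutual (cross) attention fusion, assuming token-level text features and patch-level emoji image features of matching dimension, one direction lets text attend over emoji features and the other lets emoji features attend over text, after which the two attended views are pooled and concatenated (function names here are hypothetical, not from the paper):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cross_attention(query, context):
    # scaled dot-product attention: each query row attends over context rows
    d_k = query.shape[-1]
    scores = query @ context.T / np.sqrt(d_k)        # (n_q, n_c)
    weights = softmax(scores, axis=-1)               # rows sum to 1
    return weights @ context                         # (n_q, d)

def mutual_attention_fuse(text_feats, emoji_feats):
    """Fuse text and emoji features via bidirectional cross-attention.

    text_feats:  (n_tokens, d) token-level text representations
    emoji_feats: (n_patches, d) visual emoji representations
    returns a (2*d,) fused vector for a downstream sentiment classifier.
    """
    text_attended = cross_attention(text_feats, emoji_feats)   # text -> emoji
    emoji_attended = cross_attention(emoji_feats, text_feats)  # emoji -> text
    # mean-pool each attended view, then concatenate
    return np.concatenate([
        text_attended.mean(axis=0),
        emoji_attended.mean(axis=0),
    ])
```

In practice such a fusion layer would use learned query/key/value projections and feed the fused vector into a classification head; the sketch above keeps only the mutual-attention core.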
Pages: 12
Related Papers
50 records
  • [1] GATED MECHANISM FOR ATTENTION BASED MULTIMODAL SENTIMENT ANALYSIS
    Kumar, Ayush; Vepa, Jithendra
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020: 4477-4481
  • [2] Multimodal Sentiment Analysis Based on Bidirectional Mask Attention Mechanism
    Zhang Y.; Zhang H.; Liu Y.; Liang K.; Wang Y.
    Data Analysis and Knowledge Discovery, 2023, 7 (04): 46-55
  • [3] Emoji-Based Sentiment Analysis Using Attention Networks
    Lou, Yinxia; Zhang, Yue; Li, Fei; Qian, Tao; Ji, Donghong
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2020, 19 (05)
  • [4] Sentiment classification of microblog: A framework based on BERT and CNN with attention mechanism
    Jia, Keliang
    COMPUTERS & ELECTRICAL ENGINEERING, 2022, 101
  • [5] Multimodal sentiment analysis based on multi-head attention mechanism
    Xi, Chen; Lu, Guanming; Yan, Jingjie
    ICMLSC 2020: PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND SOFT COMPUTING, 2020: 34-39
  • [6] Multimodal Sentiment Analysis Based on Attention Mechanism and Tensor Fusion Network
    Zhang, Kang; Geng, Yushui; Zhao, Jing; Li, Wenxiao; Liu, Jianxin
    2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021: 1473-1477
  • [7] Multimodal Mutual Attention-Based Sentiment Analysis Framework Adapted to Complicated Contexts
    He, Lijun; Wang, Ziqing; Wang, Liejun; Li, Fan
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2023, 33 (12): 7131-7143
  • [8] Microblog sentiment analysis method based on a double attention model
    Zhang Y.; Zheng J.; Huang G.; Jiang Y.
    Qinghua Daxue Xuebao/Journal of Tsinghua University, 2018, 58 (02): 122-130
  • [9] Multimodal sentiment analysis based on multiple attention
    Wang, Hongbin; Ren, Chun; Yu, Zhengtao
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, 140
  • [10] Microblog Sentiment Classification Method Based on Dual Attention Mechanism and Bidirectional LSTM
    Wei, Wenjie; Zhang, Yangsen; Duan, Ruixue; Zhang, Wen
    CHINESE LEXICAL SEMANTICS (CLSW 2019), 2020, 11831: 309-320