Attention-based multimodal contextual fusion for sentiment and emotion classification using bidirectional LSTM

Citations: 0
Authors
Mahesh G. Huddar
Sanjeev S. Sannakki
Vijay S. Rajpurohit
Affiliations
[1] Hirasugar Institute of Technology, Department of Computer Science and Engineering
[2] Gogte Institute of Technology, Department of Computer Science and Engineering
Keywords
Multimodal fusion; Contextual information; Attention model; Bidirectional LSTM;
DOI
Not available
Abstract
Due to the enormous amount of multimodal content available on the social web and its applications, automatic sentiment analysis and emotion detection have become important and widely researched topics. Improving the quality of multimodal fusion is a key issue in this field. In this paper, we present a novel attention-based multimodal contextual fusion strategy that extracts contextual information among the utterances before fusion. We first fuse two modalities at a time and then fuse all three modalities. We use a bidirectional LSTM with an attention model to extract important contextual information among the utterances. The proposed model was tested on the IEMOCAP dataset for emotion classification and the CMU-MOSI dataset for sentiment classification. By incorporating contextual information among utterances in the same video, the proposed method outperforms existing methods by over 3% in emotion classification and over 2% in sentiment classification.
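The abstract describes the architecture only at a high level: a per-modality contextual bidirectional LSTM with attention over the utterances of a video, fusion of two modalities at a time, and a final trimodal fusion. The following is a minimal PyTorch sketch of that pipeline, not the authors' implementation; all module names, feature dimensions, the concatenation-based fusion, and the soft-attention formulation are illustrative assumptions.

# Minimal sketch (not the authors' code): contextual BiLSTM + soft attention per
# modality, pairwise fusion of two modalities at a time, then trimodal fusion.
import torch
import torch.nn as nn


class ContextualAttentionBiLSTM(nn.Module):
    """BiLSTM over the utterance sequence of one video, followed by soft attention."""

    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.bilstm = nn.LSTM(in_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden_dim, 1)  # scalar attention score per utterance

    def forward(self, x):                          # x: (batch, utterances, in_dim)
        h, _ = self.bilstm(x)                      # (batch, utterances, 2*hidden_dim)
        weights = torch.softmax(self.attn(h), dim=1)   # attention over utterances
        return h * weights + h                     # attended features + residual


class PairwiseThenTrimodalFusion(nn.Module):
    """Fuse two modalities at a time, then fuse all three (concatenation here)."""

    def __init__(self, dims, hidden_dim, num_classes):
        super().__init__()
        d_t, d_a, d_v = dims                       # text / audio / video feature sizes
        self.text = ContextualAttentionBiLSTM(d_t, hidden_dim)
        self.audio = ContextualAttentionBiLSTM(d_a, hidden_dim)
        self.video = ContextualAttentionBiLSTM(d_v, hidden_dim)
        # each bimodal pair is re-contextualised before the final trimodal stage
        self.ta = ContextualAttentionBiLSTM(4 * hidden_dim, hidden_dim)
        self.tv = ContextualAttentionBiLSTM(4 * hidden_dim, hidden_dim)
        self.av = ContextualAttentionBiLSTM(4 * hidden_dim, hidden_dim)
        self.classifier = nn.Linear(6 * hidden_dim, num_classes)

    def forward(self, text, audio, video):
        t, a, v = self.text(text), self.audio(audio), self.video(video)
        ta = self.ta(torch.cat([t, a], dim=-1))    # bimodal fusion: text + audio
        tv = self.tv(torch.cat([t, v], dim=-1))    # bimodal fusion: text + video
        av = self.av(torch.cat([a, v], dim=-1))    # bimodal fusion: audio + video
        tri = torch.cat([ta, tv, av], dim=-1)      # trimodal fusion
        return self.classifier(tri)                # per-utterance class logits


if __name__ == "__main__":
    model = PairwiseThenTrimodalFusion(dims=(100, 73, 512), hidden_dim=64, num_classes=6)
    text = torch.randn(2, 20, 100)   # 2 videos, 20 utterances each
    audio = torch.randn(2, 20, 73)
    video = torch.randn(2, 20, 512)
    print(model(text, audio, video).shape)  # torch.Size([2, 20, 6])

Concatenation stands in here for the bimodal and trimodal fusion operators; the paper's exact fusion and attention details are not reproduced.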
Pages: 13059-13076
Number of pages: 17
Related papers
50 in total
  • [41] Emotion classification of Indonesian Tweets using Bidirectional LSTM
    Glenn, Aaron
    LaCasse, Phillip
    Cox, Bruce
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (13) : 9567 - 9578
  • [43] Favorite Video Classification Based on Multimodal Bidirectional LSTM
    Ogawa, Takahiro
    Sasaka, Yuma
    Maeda, Keisuke
    Haseyama, Miki
    IEEE ACCESS, 2018, 6 : 61401 - 61409
  • [44] CoBiCo: A model using multi-stage ConvNet with attention-based Bi-LSTM for efficient sentiment classification
    Ranjan, Roop
    Daniel, A. K.
    INTERNATIONAL JOURNAL OF KNOWLEDGE-BASED AND INTELLIGENT ENGINEERING SYSTEMS, 2023, 27 (01) : 1 - 24
  • [45] Attention-based Emotion-assisted Sentiment Forecasting in Dialogue
    Zou, Congrui
    Yin, Yunfei
    Huang, Faliang
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022
  • [46] Attention-Based Memory Network for Text Sentiment Classification
    Han, Hu
    Liu, Jin
    Liu, Guoli
    IEEE ACCESS, 2018, 6 : 68302 - 68310
  • [47] Multimodal Sentiment Analysis Based on Bidirectional Mask Attention Mechanism
    Zhang Y.
    Zhang H.
    Liu Y.
    Liang K.
    Wang Y.
    Data Analysis and Knowledge Discovery, 2023, 7 (04) : 46 - 55
  • [48] AB-LSTM: Attention-based Bidirectional LSTM Model for Scene Text Detection
    Liu, Zhandong
    Zhou, Wengang
    Li, Houqiang
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2019, 15 (04)
  • [49] MultiEMO: An Attention-Based Correlation-Aware Multimodal Fusion Framework for Emotion Recognition in Conversations
    Shi, Tao
    Huang, Shao-Lun
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023): LONG PAPERS, VOL 1, 2023 : 14752 - 14766
  • [50] Sentiment analysis based on aspect and context fusion using attention encoder with LSTM
    Soni J.
    Mathur K.
    International Journal of Information Technology, 2022, 14 (7) : 3611 - 3618