A Sentiment Analysis Method for Big Social Online Multimodal Comments Based on Pre-trained Models

Cited: 0
Authors
Wan, Jun [1 ,2 ]
Wozniak, Marcin [3 ]
Affiliations
[1] Mahasarakham Univ, Maha Sarakham, Thailand
[2] ChongQing City Vocat Coll, Chongqing, Peoples R China
[3] Silesian Tech Univ, Fac Appl Math, Gliwice, Poland
Source
MOBILE NETWORKS & APPLICATIONS | 2024
Keywords
Pre-trained model; Social multimodality; Online comments; Big data; Emotional analysis; COVID-19; CLASSIFICATION; FUSION; NETWORK;
DOI
10.1007/s11036-024-02303-1
Chinese Library Classification
TP3 [Computing Technology, Computer Technology]
Discipline Code
0812
Abstract
Comment data on social media platforms contains not only large volumes of text but also many emoticons, and this multimodal nature makes sentiment analysis more difficult. A big data sentiment analysis technology for social online multimodal (SOM) comments is proposed. The technology uses web scraping to collect SOM comment big data, including text data and emoji data, from the internet, then extracts and segments the text big data and preprocesses it with part-of-speech tagging. An attention-mechanism-based feature extraction method for big SOM comment text data and a correlation-based expression feature extraction method for SOM comment emoji data are used to obtain the emotional features of the comment text and the expression-package data, respectively. Taking the two extracted emotional features as inputs and building on the ELMo pre-trained model, a GE-BiLSTM model for SOM comment sentiment analysis is established. This model combines the ELMo pre-trained model with the GloVe model to obtain the emotional factors of social multimodal big data; after recombining them, the output layer of the GE-BiLSTM model produces the sentiment analysis result for the big SOM comment data. Experiments show that the technology has strong extraction and segmentation capability for SOM comment text data, effectively extracts the emotional features contained in both the text and emoji data, and obtains accurate sentiment analysis results for big SOM comment data.
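The fusion pipeline the abstract describes — combining contextual (ELMo-style) and static (GloVe-style) token embeddings, pooling them with an attention mechanism, and classifying the pooled vector — can be sketched minimally. All dimensions and weights below are illustrative assumptions, with random vectors standing in for the real pre-trained embeddings; this is not the authors' GE-BiLSTM implementation, only the recombine-attend-classify idea.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins: in the paper the contextual vectors come from ELMo
# and the static vectors from GloVe; here both are random for illustration.
seq_len, d_ctx, d_static = 6, 8, 4
contextual = rng.normal(size=(seq_len, d_ctx))   # ELMo-style, one vector per token
static = rng.normal(size=(seq_len, d_static))    # GloVe-style, one vector per token

# Recombine the two embedding sources by concatenating them per token.
fused = np.concatenate([contextual, static], axis=1)  # shape (seq_len, d_ctx + d_static)

# Attention-based pooling over tokens: score each token, softmax, weighted sum.
w_attn = rng.normal(size=fused.shape[1])
scores = fused @ w_attn
alphas = np.exp(scores - scores.max())
alphas /= alphas.sum()                            # attention weights sum to 1
sentence_vec = alphas @ fused                     # pooled sentence representation

# Toy sentiment head: logistic regression over the pooled vector.
w_out = rng.normal(size=sentence_vec.shape[0])
prob_positive = 1.0 / (1.0 + np.exp(-(sentence_vec @ w_out)))
print(prob_positive)
```

In the paper the pooled features feed a BiLSTM rather than a single linear head, but the same two-source fusion and attention weighting apply per time step.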
Pages: 14